The only difference between being laughed at and laughed with:
- whether or not you are laughing, too.
A complicated subject: should children even watch TV? If so, how often?
And what to watch?
My children are the center of my universe, and I think carefully about everything that goes into their tummies, brains, and gigantic little souls. That sounds really clichéd, and kinda overwrought, but wait until you have children. Wow.
So I can make a few recommendations, with the caveat that all of these shows are available (or not) on multiple platforms (even free on YouTube, sometimes), so seek and find at your discretion:
…when you get your first job as a software developer and begin the grinding climb to competency. It’s been worth it, but it’s been a long row to hoe.
It is popular to assume that nefarious forces are at work when our government does things that we strongly disagree with. This conviction holds true across the spectrum of political ideologies. But I’m convinced that most people in government have good intentions. Which brings to mind an old saying, something about how the road to hell was paved…
Anyway, things like government are almost always more complicated than most people seem to realize. So perhaps those who critique the size of our government have a point. Maybe we live in a world where things have gotten too complex for us to grasp the big picture, or even a smaller component part of it.
But I've never believed that one must be evil in order to do evil things.
Data visualization is a fascinating thing. It can enhance our understanding of reality by modeling incredibly complicated things in a manner that makes them “touchable” or “graspable”.
There are different types of data visualization, a fact that corresponds precisely to one of my favorite themes:
1) reality is an incredibly complicated thing, and faceted like a gemstone;
2) we are only able to apprehend/comprehend one facet at any given moment of awareness;
3) if we make the effort to understand all of the various facets, one by one, we can begin to have an intuitive grasp of the larger and more complex structures of meaning and connection that lie underneath the surface of our day-to-day realities.
I’ve proffered a simplistic model, to be sure, but this article does a very nice job of showing how data visualization is being used to understand 3-dimensional levels of complexity that are almost too much for our brains to handle. I like it when we use our IT tools to understand the facts better, rather than to distort, lie, and sell crap to people who aren’t paying proper attention.
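(A purely illustrative aside of my own, not from the article: here is a minimal Python/matplotlib sketch of the basic idea, taking one made-up “facet” of a complicated function and projecting it onto a flat screen so the eye can actually grasp it. The data and names are invented for the sake of the example.)

    # A toy 3-D visualization: projecting one "facet" of a complicated
    # surface onto a 2-D screen. The data here is synthetic -- just an
    # illustration of the general idea, not any particular technique.
    import numpy as np
    import matplotlib.pyplot as plt
    from mpl_toolkits.mplot3d import Axes3D  # noqa: F401 (registers 3-D axes on older matplotlib)

    # Build a grid and a deliberately messy function over it.
    x = np.linspace(-3, 3, 200)
    y = np.linspace(-3, 3, 200)
    X, Y = np.meshgrid(x, y)
    Z = np.sin(X * Y) * np.exp(-(X**2 + Y**2) / 4)

    # Render it as a surface we can rotate and inspect, facet by facet.
    fig = plt.figure()
    ax = fig.add_subplot(111, projection="3d")
    ax.plot_surface(X, Y, Z, cmap="viridis")
    ax.set_xlabel("x")
    ax.set_ylabel("y")
    ax.set_zlabel("z")
    plt.show()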
Alexia Tsotsis is co-editor at TechCrunch, a silicon-centric web-zine. And she’s a damn fine writer. I re-post an excerpt from a recent article on the New Gilded Era:
“Silicon Valley is suffering from an acute fallacy of composition: Just because it does some good doesn’t mean the whole is good. Tech isn’t above harming society. Just because change (i.e. Disruption) is inevitable doesn’t mean it’s always welcome.
Machine guns were innovation. They Disrupted muskets. They also Disrupted a lot of human bodies in World War II. Pharmaceuticals save lives. But they also let people numb emotional pain rather than face it, quiet their children rather than teach them. Social games can be seen as entertainment and relaxation. They can also be seen as dehumanizing thieves of our time and attention.
The tech sector is particularly ill-suited to address its own footprint, staving off its rich guilt with the misguided belief that it lives in a meritocracy. Hell, even the people who blog about it are rich.
Like the problem of technology replacing jobs, there is no solution to technology’s feigned innocence. As nerds and underdogs, we will always believe we have the best intentions. That doesn’t negate the problem: Even though we’re not Washington D.C., we are still an industry with absurd amounts of power, attention and money. And plenty of intentional and unintentional opportunities to abuse it.”
A few points:
Technology empowers us. Sometimes, technology frees us. Questions remain: what do we do with that power, and with that freedom? How do we spend our time, those of us lucky enough to live in a society where much of the nasty stuff was abstracted away before we were born? How are we empowered, and how much? Who is more empowered, and who less? And perhaps most importantly: how free are we, really?
George Orwell wrote of a possible future that seemed plausible at the time, and there are certainly pockets of the human world where the overt surveillance and routine brutality of totalitarian control are the norm. However, though Orwell was perhaps the better writer, Aldous Huxley was the more prescient imaginer. “Brave New World” was a distant early warning for anyone who has ever watched more than a few hours of TV per week, or taken prescription mood-altering medications, or slowly drowned in a bottomless glass of booze. I’ll let a better writer say it. From Neil Postman:
What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism. Orwell feared that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture…. As Huxley remarked in Brave New World Revisited, the civil libertarians and rationalists who are ever on the alert to oppose tyranny “failed to take into account man’s almost infinite appetite for distractions.” In 1984, Huxley added, people are controlled by inflicting pain. In Brave New World, they are controlled by inflicting pleasure. In short, Orwell feared that what we hate will ruin us. Huxley feared that what we love will ruin us.
I am set to receive my Associate Degree in Computer Programming from Gwinnett Technical College in a few weeks. It’s been quite a ride. I would like to say that I don’t recommend it, but that would not be true, and it would never be that simple anyway.
Though technical/community colleges do not get the respect that 4-year universities do, the fact is that some programs, at some tech schools, require a much higher level of intellectual engagement and sheer grinding work than perhaps half of all major university programs in the country. There are some very smart, very driven people coming out of tech school right now. I know some of them. I’d hire them in a heartbeat.
I’ve been over-scheduled and over-worked for a couple of years now. The sleep deprivation, and the attendant emotional and physical exhaustion, have been intense, sometimes to the point of badly affecting my health and safety.
So, I’m tired.
And I just got a job, one that pays real money. It’s nothing glorious, and it might not last much more than six months. And it doesn’t involve moving out to California to join the tech elite, thank god. It’s really just a well-paying internship, honestly, but one where I’ll be doing something I like to do, and something that promises to offer a lot of opportunity in the near future. I’ll be studying JavaScript/jQuery, .NET and a few other things over the next month or so in preparation for my first day on May 5. It’s another startup, so it might not last (for any of us), but it’s a win-win for me, because the first coding job is usually the hardest one to find, and my graduating classmates and I are realizing that we are not quite prepared for a full-on job in this field: we need mentorship, we need internships, and we need real-world experience. So getting the opportunity to work for, and learn from, successful professionals is a dream come true, really, and if it only lasts six months, I’ll be ready for the next step by then. Guaranteed.
So I’m putting in my notice at Big City Bread Cafe, where I’ve managed (and learned) for four years, and at Vitamin C Software, where I’ve interned (and learned) for three months. And I’ll spend the next 3-4 weeks finishing school and studying for my new job. And I’ll sit back and think for a minute, now and then, about all that I’ve been through, all that I’ve seen and learned and done these last few very intense years. And then Jonah and his mommy will get home and I’ll go outside and run around in the yard with my son and I’ll be shocked, again, that I could ever feel so much joy.
As a child, I didn’t understand this metaphor of “the shadow of death”. I suppose I had some vague notion of God protecting me from physical danger. With time I came to understand the truth. My confusion was not simply that of a child. Metaphors and parables are trickier than they seem. They touch on something innately paradoxical about us, and about the way we think.
It is said that we are defined by our choices. The problem is, most of us spend most of our lives not realizing how many choices we make every day.
Carl Jung spoke of the persona: the face that we present to the world, the self we imagine ourselves to be, the self we want others to believe us to be.
One lesson to take from Jung’s writings: lying to others is really only possible if you are already lying to yourself. That is a dangerous game, and one played by far too many people.
Another lesson Jung offers is this: good actors are doing something magical and self-transformative. The wearing of masks, or the donning of other personae, is an ancient and honored part of human culture. The urge to do so lies somewhere near the oldest part of our human brains. The trick is to wear the mask consciously and conscientiously, not in an ongoing state of existential panic.
Carina Chocano writes:
In starting to lay out the possible uses of regret, [Janet Landman, author of Regret: The Persistence of the Possible,] quotes William Faulkner. ‘The past,’ he wrote in 1950, ‘is never dead. It’s not even past.’ Great novels, Landman points out, are often about regret: about the life-changing consequences of a single bad decision (say, marrying the wrong person, not marrying the right one, or having let love pass you by altogether) over a long period of time. Sigmund Freud believed that thoughts, feelings, wishes, etc, are never entirely eradicated, but if repressed ‘[ramify] like a fungus in the dark and [take] on extreme forms of expression’. The denial of regret, in other words, will not block the fall of the dominoes. It will just allow you to close your eyes and clap your hands over your ears as they fall, down to the very last one.
Not surprisingly, it turns out that people’s greatest regrets revolve around education, work, and marriage, because the decisions we make around these issues have long-term, ever-expanding repercussions. The point of regret is not to try to change the past, but to shed light on the present. This is traditionally the realm of the humanities. What novels tell us is that regret is instructive. And the first thing regret tells us (much like its physical counterpart — pain) is that something in the present is wrong.
The take-away: whistling in the dark is no good if you are trying to distract yourself from how scared you are. Only whistle in the dark as a means of echo-location, as a way of finding your way through the darkness. Pretending everything is okay when things are clearly wrong is not bravery: it is cowardice.
Jung also believed that people are at their most hopeless and most desperate right when they are ready to shed an old version of themselves and start anew. But that is precisely when most of us refuse to let go of our older selves, and so avoid the process of change. That is “the shadow of death”, the feeling of dread that you are about to be extinguished, that all the things you are, and all the things you hold dear, are about to be ruined.
So: put on a face for the world, but fill it full of you, not of the self you imagined as a child. Wear your mask with full awareness of what you do. Let it be a conscious choice. Don’t get trapped by the idea of who you thought you had to be.
And don’t tell yourself that you are not afraid of the dark, or that the darkness is not even there. You’re not fooling anyone.
From Justin St. Clair, in a recent book review:
“Suddenly you’re wondering what the hell Pynchon was doing in the Quad Cities.”
I know the feeling well. Not that I was ever in the Quad Cities myself: I’ve always skirted the Midwest in my travels. But it seems that the reclusive (yet peripatetic) Thomas Pynchon has been there: he included a tiny detail about it in his latest book, “Bleeding Edge”. That’s one of Pynchon’s hallmarks, to throw a bewildering array of real-world details at his works of fiction. The word for that trick, or nervous tic, is “verisimilitude”: the idea that if a narrative artist includes in their work enough bits and pieces of the world as they themselves have experienced/seen/imagined it, their art will take on the feeling of “being real”, of being grounded in something resembling consensual reality. This is a trick Becker and Fagen used to great effect in so many Steely Dan lyrics.
Thomas Pynchon gets around. Or he used to, back in the day. He is, by all accounts, a bit more settled now. But perhaps he has notebooks full of little vignettes, tiny bits of set and setting left over from his rambling days, stacks of “scenes” jotted down years ago, all carefully filed away for future use the way an old professor stores notes and observations in a filing cabinet near his desk.
And so some arcane little detail about Mr. St. Clair’s hometown makes its way into a new book by one of the greatest legends of modern fiction, and Mr. St. Clair is a little creeped out by it. I have no sympathy: one should know what one is getting into by now, if one has any sort of relationship with the novels of Thomas Pynchon. Some things are not for the faint of heart. You buy your ticket, you take the ride.
I read “Gravity’s Rainbow” years ago, very quickly the first time, then again at a slower pace. It was mind-blowing, the searing kaleidoscope of words, the staggering amount of imagery and detail and insight cascading into a blur of cognition and buzzy confusion. There were patterns to it that I discerned but didn’t quite grasp, patterns that I’m still not sure are entirely comfortable, or even safe, to comprehend. I got truly paranoid at one point, lurking in the library in old Savannah, looking over my shoulder once or twice to see if anybody saw me sifting through old books and periodicals. I was trying to verify how much of Pynchon’s outrageous yarn was “true” and how much was “fiction”. This sounds odd, I know, but only if you’ve never read that particular book.
“Bleeding Edge” is at the top of my reading list, at any rate. I hope to finish with school by summer semester, 2014, and then maybe I’ll have time for the odd bit of pleasure reading.
We went to the beach in late July. It looked like this:
It was lovely.
The whole family was there, so it was lively, too.
I got stung by a jellyfish, no big deal. I hear it gets worse, depending on the jellyfish. I spend a lot of time in the water when at the beach, so I was bound to get stung sooner or later.
Jonah gradually acclimated to the vastness of the ocean, and by the end he was pretty comfortable. But you can’t blame a tiny man for feeling a little intimidated by such a deep, noisy beast.
That’s my head out there in the waves. I like it out there. It’s soothing. Except when getting stung by jellyfish, of course.
Jonah soon learned the wisdom of being prepared. Or of being accessorized. Whichever.
And we finally got a pretty good family portrait.