Give Thanks to the Time Lord

Exactly 100 years ago, an alien mind inside an unusual man opened the eyes of humanity and forever changed the way we see the universe. His name has become synonymous with Time and Relativity and Space, yet his face has appeared to the world in many different forms.

No, I’m not talking about The Doctor. At least not the one from Gallifrey.

Much like the Tenth and Eleventh Doctors, though, Doctor Einstein was also a hipster icon of his day. He may not have worn Chucks with a suit or made bow ties and the fez cool, but he did rock a thrift-shop wardrobe, some seriously crazy hair, and a retro bicycle as his regular ride.

Einstein was at times a frustrated genius, burdened – like the first few Doctors – by the inability of mere mortals to keep up with his intellect. He could also be patient and calm, like the Seventh Doctor, or grumpy like the Twelfth, and occasionally scare the shit out of people, much as the Fourth Doctor traumatized my childhood (no thanks to my older brother’s TV habits).

Like the Ninth Doctor (and the uncounted incarnation before him), Einstein was often a man reeling from and railing against war. But he also never lost his sense of whimsy – his inner Sixth Doctor – or (for some of us weirdos) his romantic appeal, like the Eighth.

Doctor Einstein saw and solved problems no one on this planet even conceived of, by keeping his mind open and letting his imagination lead the way. He quite often failed on the way to success, and knew that sometimes it is best to blow things up in order to put them back together much stronger. And 100 years ago, over the four Thursdays in November, 1915, he presented to the world his theory of General Relativity – quite literally creating the fabric of space-time with his mind.

I like to think of Einstein as the Zero-th Doctor – because any mathematician knows that counting really starts at zero, not one.

So today, as we celebrate family and friends, eat delicious meals, and give thanks for all that we have in this universe, spare a thought for the wonders of science. For the spaces we gather to share food, the gravitational pull that draws us together, and the time that slows down when life is really good.

And give thanks for Doctor Einstein, who opened our eyes to it all 100 years ago today. He is truly the original Time Lord.

Time Weights For All Man

Time is a flat circle. No, wait; Time is an increasingly desperate “news” magazine. Time is an herb to the blind and spelling-impaired?

Time is all of these things, and also none of them, for time – like color and popularity – is merely a product of our own perception.

We know this because 110 years ago a bored 26-year-old patent clerk started daydreaming and ended up having the best year of his – or of anybody’s – life. It’s a good thing there weren’t more people inventing things in Bern around 1905.

Exactly 110 years ago this month, that bored patent clerk – who was still bored even after publishing a paper in March that would become the foundation for quantum physics and another in early May about Brownian motion – started thinking about Galileo and Newton, and how the motion of objects is relative to the motion of an observer. He then asked a question no one else ever had: What about light?

That got him daydreaming about speeding trains, and the universe was changed forever. Literally.

[What is it with men and their fascination with vehicles? Galileo and Newton determined relative motion by thinking about boats (while a man walking on a ship’s deck may travel 10 meters in 10 seconds from his own perspective, he travels much farther to a person on shore watching the ship sail by), and Einstein used trains to show that a beam of light bouncing from floor to ceiling travels one distance to an observer on the train but a longer distance to an observer watching the train from a hill. Boys and cars, man. It’s genetic.]

By the end of June, 1905, Albert Einstein had submitted his Special Theory of Relativity, which stated that if light truly does always move at a constant speed (which experiments had shown but scientists had been reluctant to accept), then time must be just as relative to the observer as distance and motion and acceptable fashion standards (shoulder pads, anyone?).
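For anyone who wants the arithmetic behind that train (a quick sketch using my own symbols, not anything from the original papers): call the train’s speed v and let t0 be one tick of the light clock as timed on board, so the flash crosses a height of c·t0. Seen from the hill, that same flash traces the hypotenuse of a right triangle, which gives

c·t = √( (c·t0)² + (v·t)² )   ⟹   t = t0 / √(1 − v²/c²)

The hill observer’s tick t always comes out longer than t0 – the faster the train moves, the slower its clock appears to run.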

Suddenly, a clock was nothing more than a series of countable moments; a second merely an agreed-upon unit that only stays consistent so long as we remain still relative to the timepiece. As soon as we accelerate that clock out the window, those seconds get longer. So…time doesn’t fly as it flies.

Our bodies already know this; the heart is a clock, beating out a series of countable events, and the faster we move the slower time progresses. Stay active, stay young.

Before 1905 played out, Einstein managed to blow minds open one more time by proving that mass and energy are on the same spectrum (or, as it is more commonly known, that energy (E) equals mass (m) times the speed of light (c) squared). This equation gave us the power (nuclear power) and Einstein the ability to reach his most influential deduction of all – which, given his work thus far, is certainly saying something.

If light is energy (paper one), he thought, and energy has mass (paper four), then light has mass and should be affected by gravity (it is – eventually, in 1919, a solar eclipse allowed experimenters to prove that light does indeed bend its path when traveling past a large body such as the sun). And if the path of light is bent by gravity, Einstein continued, then so must time be affected (paper three).

It took a decade to work out the math, but 100 years ago this November Albert Einstein was able to present his General Theory of Relativity, which tied the fourth dimension (time) to the three we already knew so well (space) to introduce the idea of Space-Time as the fabric of our universe. A fabric that, like a fabric should, gives and curves around heavier objects. The larger a mass, the more it tells both space and time to “get bent”. That’s gravity.

So if time, like light and space and anything else with mass, is affected by gravity, it makes sense that time itself has mass. Finally! That explains the Sunday evening doldrums, when the weight of the weekend that hangs behind us requires a Herculean effort of will to drag into Monday morning.

It also explains why, as I try to fall asleep some nights, I can physically feel those ounces of time passing through me – from future, to present, to past – adding their weight to the ever-increasing mass of time that lies behind. One. Heartbeat. At. A time. How much farther must I carry that weight toward the unknown destination in my future? Can I keep moving forward as it keeps getting heavier every day? At what point will it weigh too much and drag me to a complete standstill – or backwards?

On these nights, I find it comforting to remember E = mc² and the fact that the accumulating mass of my past also increases its potential energy. The longer it takes me to get…wherever, the brighter I can burn when I do.

Other times, I just roll over and find a fuzzy cat ass in my face. A cat ass in the face is pretty much the best life has to offer anyway, so what’s my hurry?

Logical Mystery Tour

Once upon a short time ago, I spent over twenty minutes arguing with a Time Warner Cable representative about how math works.

My monthly cable bill had suddenly increased by $7 (increased again, I should say, because this was not the first time), so I had looked and found a new $7 charge listed for the modem. (The modem I had been using for no charge since…always.)

The TWC representative tried repeatedly to convince me that they had always been charging me $7 for the modem, it’s just that now they were listing the fee as its own line item on the bill. I replied that if that were true my bill total would not have increased (because, math), but it had increased, so there was clearly a new charge for something, and would she please just fess up to it already.

After twenty minutes of our own little version of Waiting for Godot (“I recognize that tree!”) she finally succumbed to the power of how numbers work and agreed there was a new fee. I agreed to no longer be a Time Warner Cable customer.

While I appreciate that this woman provided the kick I needed to finally bail on cable, our conversation makes me want to bang my head against a wall. For six years, I have spent much of my time helping adults prepare themselves for the rigors of law school, and in that time I have been repeatedly surprised and disheartened – as I was on that phone call – with the general lack of logical reasoning employed by humanity.

Logic is important, even if only to save us from Kafkaesque conversations and murderous thoughts. If we used it more, our civilization would be in a much better place.

For one thing, logic allows us to recognize when people (and cable companies) are lying. It demands reasons and facts be given to support arguments – including our own. With logic, we also recognize when a statement is technically true (“That Awkward Moment is the #1 comedy of the year!”) but essentially meaningless (“Dude, it’s still January”).

Even more relevant to our current state of debate, logic helps us stay focused on the actual point, instead of getting distracted by more convenient statements that are off topic. Sure, mental health and how we treat it is a major problem in the world, but it isn’t a relevant rebuttal to “I think there should be more gun regulation,” any more than “vegans are annoying” addresses whether we should let the pregnant pigs move around, or “I hate science” is an argument against global warming.

Most importantly, though, logic is vital because it exercises a skill that is crucial to human success: creative thinking.

It is no coincidence that Einstein was a skilled violinist while Hitler was a bad painter; creativity and reason go hand in hand. To be logical is to be able to mentally entertain as many possibilities as can be imagined and then evaluate them against whatever facts are known. It is to know that there was a mass extinction of dinosaurs, imagine the infinite reasons it could have happened, and use the evidence of meteor strikes, the lack of evidence of spontaneous combustion, and the minuscule likelihood of alien invasion to conclude that most likely the meteors were the culprit.

(It is also to know that the limited facts demand language like “most likely” instead of “of course it happened that way, how dare you question me?!” or “I don’t believe you so no it didn’t!”)

Logical thinking trains us to have flexible minds, which is the ultimate reason it needs to be more prevalent in our world today: because mental flexibility is the key to empathy. Yes, it also helps if we have and understand emotions, but empathy by definition requires the ability to think beyond our own personal situation.

In college, I was once asked by a boy (he was a boy in every sense) why I was pro-choice; to answer him, I started by saying, “given my own health issues, I can certainly imagine why someone might need-“ and he cut me off by rebutting, “It’s not about YOU. You’re so selfish.”

His statement was technically true – it wasn’t about me – but meaningless, because it WAS about my ability to put myself in another person’s shoes; to imagine circumstances that, while not true for me, may be true for someone in a different place or time or dimension.

A rigid “I would never” is not enough to close the book on any subject. That’s great that we would never; it is completely our right to choose to “never” – but somebody would, and shouldn’t we at least take the time to explore and understand their reasons before we judge?

Without empathy, progress can only happen once everyone personally knows a victim of sexual assault, a minority being denied rights, a dark-skinned person who has suffered harassment by those in authority, or someone forced to make a bad choice in a bad situation. Of course, the sad fact is, everyone already does.

That some people still refuse to acknowledge it defies logic.

Cogito Ergo Numb (A Brief History of Nerds)

The concept of Cool Nerds is by definition oxymoronic. Yet here we find ourselves, in the Age of the Geek where – to paraphrase basically every TV executive in the last decade – “nerds are totally in right now”.

As I stand on the outside of the new nerd In Crowd, I have been facing a bit of an existential crisis. Am I not nerd enough? Am I some Uber Nerd who is doubly ostracized? In truth, it is mostly an isolation of my own making, so I did some research to understand my reluctance toward being cool – which was itself a really nerdy thing to do.

As a word, “nerd” hasn’t been around for very long. Dr. Seuss used the term as a nonsense name for an imaginary creature in his 1950 book If I Ran the Zoo, but it didn’t get attached to the traditional concept of a machine-like intellectual until the mid-sixties, on East-coast college campuses. It basically took over for the word “tool” (which literally meant one who carried the tools of the nerd trade, like slide rules and pocket protectors), which itself had replaced the word “grind” (as in “nose to the grindstone”). “Nerd” finally became the popular label for the brainy crowd in 1977, thanks to Gilda Radner, Bill Murray, and the SNL sketches featuring their nerds.

[For all of this awesome history and more, I recommend you read Ben Nugent’s book “American Nerd”. I have, twice. It is wonderful.]

While the label is relatively new, the concept has been around much longer. The idea of the person who loves science and prefers rules and “ratiocination” (logical thought and argument) to ambiguity and innuendo, who is direct and precise with language to the point of being viewed as blunt, tactless, or rude, is found throughout literature and history. Mary Bennet in Pride and Prejudice is one. Thomas Jefferson was one. I am absolutely one. But why and how did this become uncool?

Like so many other wonderful things in human society, the full-blown idea of the uncool nerd was born from a combination of fear and bigotry. The grandfather of all the nerds – Nerd Prime – is Mary Shelley’s Dr. Frankenstein, whose obsession with science completely isolates him from love and family and does not end well for anyone. The novel is a cautionary tale about the dangers of focusing on logic over emotion, sprung from the fear of Romantics like Shelley who valued feelings over all.

Then came Victorian traditionalists, who lamented the increasing value of technology and strategy in warfare over brute force. They mourned the passing supremacy of the warrior / knight, and – in a classically American twist – despaired at the influx of immigrants from races and cultures more stereotypically inclined toward intelligence (or at the very least ‘book learnin’).

Put these together and the result is a societal agreement that an affinity for logic, rules, structure, and process (all things machines and role playing games offer in spades) is separate and distinct from emotional awareness, interpersonal skills, or physical prowess. Humans as a group flipped the defining characteristic of humanity from “reason” (which had separated us from the animals) to “emotion” (which separates us from machines), and bought into the idea that a person could not be skilled at dealing with both things and people.

The glasses, bad clothes, and dorky laughs got slapped onto the image shortly after.

Thus, the concept of “nerd” came to be synonymous with “abnormal”, the perpetual clash between nerds and jocks launched into almost every aspect of society (both Tom Wolfe and Paul Feig – the creator of Freaks and Geeks – have described the American political system as a version of this battle, and one look at the styling at MSNBC and Fox proves them right), and – worst of all – generations of self-hating nerds were born. Nerds who secretly fear that we really are heartless Tin Men, or at the very least not entitled to love or romance – a price that is paid for intellectual gifts.

None of this is true, of course. Thought and feeling are not mutually exclusive, and nerds do have deep emotional lives, even if we can’t always express them in a “normal” way. A lot of progress has been made in the last decade to combat the idea of the awkward, emotionless nerd; as a group, we are learning how to dress and express ourselves, and celebrities like Tina Fey and Chris Hardwick have done wonders for making intelligence sexy. But for every emotionally vibrant nerd like The Big Bang Theory’s Leonard, there is still a caricature like Sheldon for mocking. Have we really gotten to the point where nerds are cool, or is society’s embrace just a new way of laughing at the “weird kids”?

Which brings us to hipsters, who are fake nerds wearing the cloak of uncoolness to avoid becoming actually uncool. Hipsters tend to work in creative professions, which puts a lot of pressure on them to keep their finger on the pulse of what is cool. Since that is basically impossible even for a high-school cheerleader, they defend against it by embracing the trappings of the least trendy character – the nerd – pretending to be so uncool that they can never be actually uncool.

But there is a big difference between quirk and intellectualism – one exemplified by the two title characters played by the Deschanel sisters, Zooey and Emily. The New Girl is beloved by our society; Bones is an awkward genius learning to be “more human” (and the one I love). Quirk is styling your hair and clothes like Einstein; Nerdity is actually reading books about physics and cherishing a 20-year-old teddy bear named Albeart who sports an “E=MC2” T-shirt.

Hipsters are traditionally cool people trying to appear uncool in order to preemptively ward off any challenge to their coolness. Some are actual nerds who have embraced the fake-nerd culture to be a more attractive imitation of their former selves and thus fit in, but both are putting on an act. They are also today’s taste makers, and this is the root of my discomfort with the new Geek Chic world order.

The idea of “cool” is rooted in being “normal” (whatever that means); for generations, the one and only source of nerd pride stemmed from the idea that we were at least “special”. Are we really entering an age of enlightenment where different is normal and unique gifts can be celebrated without weaknesses being mocked? That would be nice. My fear, though, is that the cool kids are simply redefining normal one more time; that we are simply on the brink of some new group becoming the epitome of “uncool”.

Will it be science deniers? Meat eaters? I hope it’s liars. That would at least appeal to my hyper-literal, rule-bound brain.

I Was Told There Would Be No Math at This Debate (Statistics Part II)

Odds are, someone at some point has quoted Einstein’s definition of insanity to you: “Doing the same thing over and over again and expecting different results.” I love this quote for several reasons, the top two being that there is no evidence Einstein ever said it and it is not what insanity actually is. Yet somehow, by people saying it over and over in hope that it is true, it has become true in our conventional wisdom. Isn’t that the kind of paradox that is supposed to rip a hole in space-time and make the universe eat itself?

The dictionary definition of insanity is being deranged or unsound of mind enough to be divorced from reality and thus responsibility. Which nicely demonstrates the root of the problem with our current public discourse – it is Einstein Nutters versus Webster Loons.

Someone needs to impose some Nurse Ratched-level tough love on the world, so here I am, with math.

Back in January, I applied Bayesian reasoning (probabilistic thinking) to relationships, in order to get a better understanding of how illogical I often am in matters of the heart. It was fun! And I started to notice something quite interesting about the math, something that has been increasingly relevant as the clash between Science and Faith has escalated.

To recap: Bayesian reasoning is a process that involves estimating the likelihood of things, then reassessing that likelihood with each new piece of information. In short – and I know this is a dirty word – Bayesian thinkers evolve their ideas over time, getting ever closer to understanding.

If this sounds familiar, it is because probabilistic thinking is how we learn. Hey, that thing on the stove is shiny and pretty – I bet it will feel good too. Ouch. Nope. That did not feel pretty. Maybe pretty things don’t always feel good. Hey, that tiger over there is really beautiful. Maybe this time… and so on. Eventually, we get a feel for the odds (or die).

The formula representing this process – Bayes’ Theorem – is simply a mathematical expression of logic at work. It centers on three variables: our original level of certainty about something (x), the probability of this new info (Ouch) if that something is true (y), and the probability of this new info if that something is not true (z). That’s it!

When we reassess in the face of new information, we simply multiply our original level of certainty (x) by the probability of the new info if our theory is true (y), then divide that by all the possibilities – our original certainty (x) times that same probability (y) PLUS our original uncertainty (1-x) times the probability of the new info if the theory is false (z). In math, that reads: (xy) / [(xy) + (1-x)z]
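For the code-inclined, here is that update as a tiny Python sketch – the function name and comments are mine, not anything official, but the variables match the x, y, and z above:

```python
def bayes_update(x, y, z):
    """Revised certainty after one new piece of information.

    x: original certainty that the theory is true (the prior)
    y: probability of seeing this new info if the theory is true
    z: probability of seeing this new info if the theory is false
    """
    return (x * y) / (x * y + (1 - x) * z)
```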

That’s the worst of it, I promise. What I find most interesting is how the impact of new information changes the more confident we are at the outset. Let me demonstrate with an example involving something totally uncontroversial right now: birth control.

Take two people, one who is super confident that I am a good little girl who keeps her knees closed (we’ll call him “My Dad”), and another who is willing to bet the farm that I am a total slut (“Rush Limbaugh”). They are both men, because involving a woman in a conversation about birth control would be ridiculous. Now, what happens to their respective outlooks when we introduce a new piece of information into their worlds: my use of birth control?

My Dad starts out with only 10% concern that I am a floozy (x=.1), while Rush is 90% sure I am sex crazed, since I am unmarried (x=.9). Variable y is the probability that I would use birth control if I am indeed a slut, which is clearly about 95% – who else would use birth control? Variable z is the probability that I would use it if I am not. Since women are either virgins or whores, that’s maybe 5%.

When we plug those probabilities into the formula, we see that My Dad, who is faced with contradictory information, skyrockets to a new 68% certainty that I am a daughter of questionable morals. As for Rush, he goes from 90% sure to 99% sure I am easy. Had we presented them with opposite information – like a purity ring on my finger – My Dad’s fears of parental failure would have dropped from 10% to .6%, and Rush would suddenly have to grapple with a mere 32% chance of my nymphomania.
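If you want to check my arithmetic, those numbers drop straight into the little sketch from earlier (give or take some rounding):

```python
# My Dad: 10% prior, and the new info is birth control (y = 0.95, z = 0.05)
bayes_update(0.1, 0.95, 0.05)   # ~0.68 -> 68% sure I'm of questionable morals

# Rush: 90% prior, same evidence
bayes_update(0.9, 0.95, 0.05)   # ~0.99 -> 99% sure

# Opposite evidence (a purity ring): y and z swap places
bayes_update(0.1, 0.05, 0.95)   # ~0.006 -> 0.6%
bayes_update(0.9, 0.05, 0.95)   # ~0.32 -> 32%
```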

Of course, my probabilities here are extreme, but the formula holds. The more confident we are in a theory at the outset, the more devastating contrary information becomes. As it should be! If we truly think an outcome is “inconceivable” and then it happens, we either have to admit that we were very likely wrong, or accept that the word “inconceivable” does not mean what we think it means.

But a funny thing happens when confidence becomes absolute certainty: new information loses all impact. When x=1 (we are 100% sure of something), the formula reduces to y/(y+0z), which equals 1 no matter what y and z are. When x=0 (“Inconceivable!”), the fraction becomes 0/(0+z), which is always zero.
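The same little function makes the dead end obvious – at absolute certainty, no evidence, however lopsided, budges the answer:

```python
bayes_update(1.0, 0.001, 0.999)   # 1.0 – still utterly convinced
bayes_update(0.0, 0.999, 0.001)   # 0.0 – still "Inconceivable!"
```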

In other words, there is no amount of evidence, experience, or new information that will change the mind of someone who has absolute certainty. Proving once and for all with math that there is no arguing with believers. (Or Beliebers – ugh.)

If you got this far, you are probably tired, because math is hard. Not in the sense that it requires a Y-chromosome (I’m looking at you, Larry Summers), but in the sense of hard work. Math is work; logic is work; being open minded requires the effort of reassessment. Faith, on the other hand, is easy. Not real faith, as defined in the dictionary (“belief in something for which there is no proof”), but the Faith demonstrated too often these days: belief despite all evidence of any kind.

You want the kicker? Thomas Bayes, from whom Bayesian reasoning gets its name, was an 18th-Century minister. I think it’s time for the universe to eat itself now.