Category Archives: Science

Mundane Monday: Assembling the Bones

This week’s Mundane Monday theme is “bones.” For this theme, Dr. KO has a nice picture of an alligator snapping turtle skeleton, which you should definitely check out (also, raise your hand if you knew before this that there was such a thing as an “alligator snapping turtle.” My hand is not up). At first I was concerned I wouldn’t have anything for this week, and then I remembered. This is a great lead-in to the topic that’s been on my mind recently: Back to School!

I have not had much time for blogging lately because my teaching job has started up again. I teach middle-grade science with a non-profit organization called Science from Scientists. Regular school, including my kids’ high school and college, started a while ago, but SciSci gives the teachers and students a week or two to get settled in before we start our visits. This year I have two completely new schools and two new co-Instructors to work with. I’m also going back to two schools I worked with last year and liked.

One of our most popular lessons at SciSci includes the dissection of an owl pellet. Most of my prior exposure to owls has been via children’s fiction, and I first learned about the pellets as an adult when I taught this lesson.

Mr Rogers’ neighborhood had an owl in it!

Owls don’t poop out the indigestible parts of their prey. They eat their prey whole and digest it in the gizzard, a second stomach. They absorb nutrients from the soft parts of the prey, as we do, and eventually regurgitate a pellet containing the indigestible bits (mainly fur and/or feathers and bones). This is a YouTube video of the process that we show our students. It was taken in Ontario in 2014:

Will you ever look at Hedwig quite the same way again?

Harry Potter and Hedwig

One of the many cool things about owl pellets is that you can dissect them and find out what the owl has eaten. If you buy the pellets from Carolina Biological Supply like we do, you usually find a mouse skeleton, maybe two, in the pellet. Occasionally you might find a bird. The prey animals can be identified by the skulls, and sometimes a large portion of the skeleton can be reconstructed.

Students dissecting an owl pellet and reconstructing a skeleton

These bones are quite small, but very interesting!

Bird skeleton

 


Amazing New Brain Map of Every Synapse Points to the Roots of Thinking

Dave Wolf is on a roll this month with posting cool neuroscience articles! This study, from a team at the University of Edinburgh, is a real tour de force that takes advantage of modern technology in neuroscience. I can imagine that some day we will be able to map the human synaptome, non-invasively and in real time. Then we would finally have the tools to more fruitfully address questions such as the nature/nurture debate, how mindfulness changes the brain, and even the nature of consciousness and transcendent experiences. Check out author Shelly Fan’s other work too!


By Shelly Fan – Aug 14, 2018*

Imagine a map of every single star in an entire galaxy. A map so detailed that it lays out what each star looks like, what they’re made of, and how each star is connected to another through the grand physical laws of the cosmos.

While we don’t yet have such an astronomical map of the heavens, thanks to a momentous study published last week in Neuron, there is now one for the brain.

If every neuron were a galaxy, then synapses—small structures dotted along the serpentine extensions of neurons—are its stars. In a technical tour-de-force, a team from the University of Edinburgh in the UK constructed the first detailed map of every single synapse in the mouse brain.

Using genetically modified mice, the team literally made each synapse light up under fluorescent light throughout the brain like the starry night. And similar to…

View original post 1,306 more words

Mundane Monday: Skies

Dr. K Ottaway, host of the Mundane Monday challenge, asks us to post this week about skies. But not blue skies, “skies that worry me” (although if you would like to see a really blue sky, check out this post of the sky over the White Cliffs of Dover). She posted a fire sky, which inspired me to post one too.

Sunset over Los Angeles made more colorful by haze from California wildfires

This picture was taken last year, 2017, from Griffith Observatory in Los Angeles. As with Dr. KO’s picture, the smoke from the fires makes the sunset eerily beautiful.

Blood moon eclipse

The sky has often been a source of worry. In the past, people looked to the heavens for signs. Eclipses, especially, were thought to portend major events.

We’ve had a few of these in the past year. I hope that instead of the primal response that such events have historically evoked, we humans can use them as signals to come together instead.

 How to Help Victims of The California Wildfires, Refinery 29, by Alejandra Salazar, August 11, 2018

 

Brain discovery could block aging’s terrible toll on the mind

My first instinct on reading this is that the findings are a bit over-hyped. A cure for Alzheimer’s? Really? Of course everyone and their brother wants to prevent brain aging and disease, and everyone and their brother has something to sell you to do that–computer programs, supplements, oils, or the next Silicon Valley biotech startup IPO. But as a neuroscientist, I think that even if it is over-hyped and not a panacea, it is still very much worth pursuing medically. It’s a fresh approach and is likely to be an important missing piece of the puzzle. The brain is a hard organ to study, and neuroscience research has progressed slowly–and not for want of trying!


Faulty brain plumbing to blame in Alzheimer’s, age-related memory loss — and can be fixed

NOTE: this article was not written by David T. Wolf, but selected by him from Science Digest

Date: July 26, 2018

Source: University of Virginia Health System

Summary: Aging vessels connecting the brain and the immune system play critical roles in both Alzheimer’s disease and the decline in cognitive ability that comes with time, new research reveals. By improving the function of the lymphatic vessels, scientists have dramatically enhanced aged mice’s ability to learn and improved their memories. The work may provide doctors an entirely new path to treat or prevent Alzheimer’s disease, age-related memory loss and other neurodegenerative diseases.

Obstructing lymphatic vessels (in green) in a mouse model of Alzheimer’s disease significantly increased the accumulation of harmful plaques in the brain. “What was really interesting is that with the worsening pathology, it actually looks very similar to…

View original post 956 more words

Film Review: Fallen Kingdom, The New Modern Prometheus

This year, 2018, marks the 200th anniversary of the publication of Frankenstein; or, The Modern Prometheus. The classic novel began as a ghost-story parlor game in 1816, the year without a summer, and was written by the teenage Mary Wollstonecraft Godwin (later Mary Shelley).

Wordsworth Classic Edition

The Governing Myth

HRW Edition

Her story of Dr. Victor Frankenstein and his monster casts a long shadow. Its adaptations in popular culture have become what author Jon Turney calls “the governing myth of modern biology”: a cautionary tale of overreaching by a scientist that ends in tragedy and death.

A central tenet of this myth is that the monster, cobbled together from dead body parts and animated by electricity–created by man, not God–is against the natural order of life. The story’s horror comes not just from fear of dying at the monster’s hand, but from a more primal sense that the universe itself will not abide this creation or those that created it. In the words of Ian Malcolm, “Life finds a way.”

The Myth Updated

Richard Attenborough in Miracle on 34th Street. Image: Jurassic Park wiki

The original Jurassic Park updated this governing myth for the 20th century. Instead of electricity and magnetism, the sexy new science to be harnessed is genetic engineering. And instead of a humanoid monster killing everything its creator loves, we have dinosaurs. But the same hubris, greed, and willful ignorance remain–along with the same sense of wonder and naive good intentions.

In a humanizing scene, John Hammond (played by Santa Claus from the remake of Miracle on 34th Street) explains to Dr. Sattler that he wanted to bring magic to children with Jurassic Park, just as he did years ago with a flea circus. But his park is collapsing, melting all around him, along with his ice cream and his dreams.

Jurassic Park Ice Cream. Image: Popcorn Cowboy

John Hammond, like Dr. Frankenstein before him, gets his comeuppance by the end of the book (although it takes a few movies to finish him off). And the other craven, greedy villains, such as Donald Gennaro and Dennis Nedry, become dinosaur food quickly and spectacularly, while of course the cute kids, Hammond’s grandchildren, survive.

Leaving the island of the dinosaurs after his misadventures, protagonist and good guy Dr. Alan Grant looks out of the helicopter to see modern pelicans flying as they should. His theory of dinosaur-to-bird evolution has been validated. His skepticism about Jurassic Park itself has been vindicated too, at great cost, and all is back in balance.

Image: GIPHY

Life found a way to put humans in their place.

Science, not Myth

The original Jurassic Park had unforgettable characters, amazing effects, and an awesome music score, and it was thematically resonant with Frankenstein, a timeless classic of English literature.

Image: Jurassic Park wiki

Jurassic Park also, almost uniquely among modern science fiction movies, contained a testable scientific hypothesis. The story spawned a virtual cottage industry of scientists looking for ancient DNA in amber, until the half-life of DNA molecules was calculated several years ago. These results showed definitively that Jurassic-era DNA could not have survived long enough to be reconstructed to clone dinosaurs. Real-life Henry Wu wannabes will have to make do with trying to bring back animals more recently extinct.
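For a sense of scale, a 2012 study of extinct moa bones estimated DNA’s half-life at roughly 521 years. Here is a minimal back-of-the-envelope sketch (in Python, my own illustration rather than anything from the studies themselves), assuming that half-life and an age of about 150 million years for Jurassic material:

```python
import math

# Rough decay calculation: how much DNA would remain after the Jurassic?
# Assumptions (illustrative): ~521-year half-life for DNA in bone
# (Allentoft et al., 2012) and an age of ~150 million years.
HALF_LIFE_YEARS = 521
JURASSIC_AGE_YEARS = 150_000_000

half_lives_elapsed = JURASSIC_AGE_YEARS / HALF_LIFE_YEARS
# Work in log space: 0.5 ** 287,908 underflows an ordinary float to zero.
log10_fraction_left = half_lives_elapsed * math.log10(0.5)

print(f"Half-lives elapsed: {half_lives_elapsed:,.0f}")
print(f"Fraction of original DNA remaining: about 10^{log10_fraction_left:,.0f}")
# Roughly 10^-87,000 of the original molecules -- effectively nothing left to read.
```

Even if the assumed half-life were off by an order of magnitude, the surviving fraction would still round to zero, which is the point those calculations made.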

The Myth Transformed

Jurassic World: Fallen Kingdom is like Jurassic Park‘s ugly stepsister, a monster cobbled together crooked from all the shiny parts of the original. Its dinosaurs are bigger, badder, and uglier. The heroes are hiding out in remote cabins and ineffective non-profit organizations. The benevolent-ish grandpa, this time named Ben Lockwood, isn’t Santa Claus. He’s an invalid who is being taken advantage of by his underlings.

And the cute grandchild, Maisie? There’s something otherworldly about her too. She lives by herself, except for an elderly governess, in a creaky old mansion above a museum, and looks and talks like English musical child prodigy Alma Deutscher.

Most of the plot of Fallen Kingdom will surprise no one. People stand there, mouths open, until they get lunched by dinosaurs. Greed and hubris are again on display in ever-uglier forms. A plucky child escapes death by dinosaur and makes a fateful decision. The audience will probably cheer when a particularly horrible example of humanity tries to take a trophy from a dinosaur he thinks is asleep and then loses his arm, and his life, in the process.

What is different in Fallen Kingdom is that while the body counts pile up, balance is no longer restored to the movie’s universe. The otherworldly Maisie turns out not to be Ben Lockwood’s granddaughter at all, but a clone of his deceased daughter. The flying creatures Owen sees at the end of Fallen Kingdom aren’t Dr. Grant’s friendly pelicans. They’re pteranodons.

Image: Jurassic World Universe

In their race to save and weaponize the most clever and aggressive dinosaurs, the humans of Jurassic World: Fallen Kingdom abandon another of their creations, the gentle plant-eating brachiosaurus. This is the same species that first evoked awe and wonder in the original Jurassic Park. The scene where a brachiosaurus calls to the retreating human ship as it awaits its own death on the island has become an audience tear-jerker. “That scene represents the ending of a dream that started 25 years ago,” says director J.A. Bayona.

I think this scene represents a new reading of the Frankenstein myth for modern scientists. Frankenstein’s creation does not start out cruel and murderous. He only becomes that way when he is abandoned by his creator. Henk van den Belt, a Dutch professor of philosophy, writes in Science magazine that Dr. Frankenstein’s greatest moral shortcoming was that he did not assume responsibility for his own creature and failed to give him the care he needed. Owen’s conscience is similarly pricked when he realizes how he may have failed to give Blue the care she needed.

Modern audiences for Frankenstein sometimes confuse the name of the scientist with the name of the monster. This confusion mirrors the increasingly monstrous behavior of the scientist. Dr. MG Bishop of King’s College Hospital in London is quoted in the same issue of Science:

Read the book and weep for those we have rejected, and fear for what revenge they will exact, but shed no tears for Frankenstein. Those who think, in ignorance of the book, that his is the name of the Monster are in reality more correct than not.

A sad brachiosaurus awaits its death on Isla Nublar as the last rescue boat leaves

In Fallen Kingdom, life as we know it no longer finds a way back. Instead, the worst impulses of human nature have found a way to transform nature itself.

This review also appears in slightly edited form on Movie Babble.

 

Mundane Monday: Fingerprints

This week’s Mundane Monday theme #164 is “A Use for Hands.” I am posting it on Tuesday because of a European hotel wifi bandwidth failure.

Last week my hands were used in a fingerprint forensics STEM outreach activity.

There are 3 classes of fingerprints: arch, loop, and whorl. Arch is the least common, with only 5% of fingers in the USA exhibiting an arch print. My right index finger happens to have a good arch.


So, I made a bunch of examples.


They were used in an outreach activity at a STEM festival last weekend. Kids who came to the booth had to figure out who stole the candy, based on fingerprint, hair, and cryptology evidence.

Here’s one of the suspects. (My hair is not really pink: it’s an app!)

Mug shot of the Strawberry Snatcher

Memories of Memorizing

When I said I had decided to perform the Telemann viola concerto from memory, I was met with some skepticism.

“You don’t *have* to, you know.”

“I don’t think I could do that.”

“A lot of soloists nowadays are using the sheet music.”

“I’d want the sheet music there just as a security blanket.”

There’s a lot of overlap between shared experience and advice. It’s a general human tendency to believe that the lessons of one’s own experience are relevant for others too. But, as I’ve learned (from—ha—experience), it’s better to let the recipient decide how and why that is true. This blog is intended in that spirit.

In my case, I need to memorize.

In my day job, I am a neuroscientist. I worked for several years in biotech, then in academia as a project manager, and I now work in STEM education and outreach. I could go on, comparing different aspects of scientific and musical careers, but for now, this concerto performance is taking me back to my PhD thesis defense. At Stanford, where I was a student, as at other major research universities, PhD candidates have to write a thesis, present their work in a departmental seminar, and then answer questions from their committee, which comprises several professors in the student’s field of research.

Cultured neurons, from a figure in my PhD thesis

My thesis committee members were intelligent and kind, and my thesis consisted largely of putting together three already-published papers and two manuscripts in preparation. I didn’t expect to fail based on my scientific work. But I did have these nagging thoughts that I could fail based on my presentation of that work. I had a history of performance anxiety and self-sabotage. There were the points lost from school reports because I read them verbatim from note cards. And the speech I gave for my failed run for student council. An All-State audition in which Mozart’s Violin Concerto #5 reduced me to tears wasn’t any better. And then came the worst one of all: the disastrous audition for the University Orchestra my freshman year in college that started me down the road to quitting the violin.

But there was a glimmer of hope in grad school, and it lay in the results of memorization. A few years before my thesis defense, I gave my first talk at a major scientific meeting, the Society for Neuroscience meeting in Phoenix AZ. My 10-minute talk was scheduled, along with two others from my lab, in a session starting at 9 am on Monday morning. The night before, I paced an empty hotel conference room, memorizing my talk word for word. One of my lab-mates had suggested I do this. She was older than I, a postdoc and a rising star in the field, known for giving good talks. And she let me in on a secret: she still got nervous. Like, really, really nervous. But these talks were only 10 minutes, short enough to memorize, and that helped her. It might help me too.

Cajal’s drawing of the hippocampus, a brain structure important for memory. Public Domain, https://commons.wikimedia.org/w/index.php?curid=612536

I had about 10 slides, so first I memorized the order of the slides, then I chose a visual cue on each that would remind me of the slide to come. When I changed to the next slide, I oriented the audience to what they were seeing and then gave the slide’s important message. Then it was time for the transition to the next one. This mental map of order of slides/visual cues/transitions/important message was something for me to hang onto and think about, even as the storms of anxiety raged.

The next morning, buses from the hotels were crowded and we almost didn’t make it to the convention center in time. With over 25,000 neuroscientists in attendance from all over the world, this conference is so big that only a few convention centers in the country can handle it, and this particular meeting took place before the Society figured out that Phoenix wasn’t one of them.

The logistics were in disarray; attendees were packed into the ballroom like sardines, without enough chairs, and the podium lights weren’t working properly. My mentor was the first from our lab to give her talk. I watched as the podium light went on and off randomly, but she continued to speak calmly. The projector functioned, but there was no pointer available, laser or otherwise, and as she stepped back to the screen to point at something on one of her slides, she disappeared entirely. In the dark, she had missed the edge of the podium and fallen off. The audience gasped. She re-emerged, uninjured, climbed back up, and finished her talk. Her voice shook but she got it under control. The podium lights came back on sometime near the end. The timing bell rang, people asked questions.

And then I was next. I took the stage wondering what fresh hell awaited.

My own talk went off without incident. The lights, and the laser pointer, and everything else were up and running by then thanks to the hardworking convention staff. I was hyper-aware of where the edge of the podium was. I knew my talk well. I’d just witnessed one of the worst things that could possibly happen during a talk, and I knew it was survivable. My friend’s preparation, the fact that she knew her talk backwards and forwards, had made the difference.

Several years later, when I was giving my thesis seminar, I had this experience to think back to. My seminar was about 5 times longer than the little 10-minute meeting talk, but I still approached it the same way: slides/visual cues/transitions/important messages. I just had more slides. I ran through them mentally, over and over again. The order was comforting; it was the stick I gave the trunk of my elephant brain to hold onto.

I passed.

Concertos don’t use slides or projectors to deliver their message the way a scientific talk does. But certain principles still hold true. First of all, having note cards, prompts, or the sheet music “just in case” isn’t going to work for me. If I know it’s available I’ll lean on it. I’ll steal a look and then start reading it verbatim. Instead I need to be prepared to look inward, not outward, even–or perhaps especially–for that cue to keep going when I stumble.

Elephants and their trunks. Photo by Antoine Plüss on Unsplash

Of central importance is something that meditation teacher Eknath Easwaran called the stick for the elephant’s trunk.

The human mind is rather like the trunk of an elephant. It never rests. It goes here, there, ceaselessly moving through sensations, images, thoughts, hopes, regrets, impulses. Occasionally it does solve a problem or make necessary plans, but most of the time it wanders at large, simply because we do not know how to keep it quiet or profitably engaged.

Easwaran goes on to recommend the mantram, a spiritual formula in the form of a word or short phrase, to steady the mind. This is a subject of study for a lifetime. And I am not naturally a great meditator; sometimes when I try, it puts me to sleep. Furthermore, I find words themselves to be an awkward fit for a steadying mental substrate.

My mind gravitates more towards deeper non-verbal sensory experiences: pictures, kinesthetic feelings, and music. It is those sensations that I string together as another kind of mantra. Not PowerPoint slides this time, but bridges, ladders, and lattices. Finger patterns, and arpeggios climbing to the sky before sliding back down the other side of the bow. The deep purple of the C, the forest green of the G, as I put bow to string.

Brain Coral. Photo by Daniel Hjalmarsson on Unsplash

Happy DNA Day!

Today, April 25, is National DNA Day. This is an obscure holiday, ranking somewhere above International Violin Day but probably below the recent Pi Day, or even the upcoming “May the 4th be with you.”

It celebrates something important, though. National DNA Day commemorates the successful completion of the Human Genome Project in 2003 and the discovery of DNA’s double helix by James Watson and Francis Crick in 1953.

DNA stands for DeoxyriboNucleic Acid. The “nucleic” part refers to the cell nucleus, where DNA was first discovered. There are many different ways to visualize DNA. These are some of the most common:

Francis Crick's DNA model, Rosalind Franklin's X-ray crystallography, a schematic double helix, a bacterial chromosome, some DNA isolated in the lab
Different views of DNA

The upper left shows Watson and Crick’s original wire model of the DNA double helix. On the lower left, the abstract X is one of Rosalind Franklin’s X-ray crystallograms that were used to help determine the structure. The upper right, looking like a beaded necklace, is a bacterial chromosome viewed under an electron microscope. And the lower right, with what looks like some cloudy snot in a tube, shows some DNA extracted from strawberries with common kitchen materials.

The center picture shows the DNA double helical model that has entered popular culture. A DNA molecule is composed of two complementary strands that wind around each other in a helical shape, a “twisted ladder.”  The rungs of the ladder, in primary colors, are the base pairs, which spell out the genetic code.
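Since the base-pairing rule is what makes the two strands “complementary,” here is a minimal sketch in Python (the sequence is just a made-up example, not from the post) of how one strand determines the other: A pairs with T, and C pairs with G.

```python
# Minimal illustration of complementary base pairing: each base on one strand
# dictates its partner on the other (A-T and C-G).
PAIRING = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complementary_strand(sequence: str) -> str:
    """Return the base-paired partner strand for a DNA sequence."""
    return "".join(PAIRING[base] for base in sequence.upper())

print(complementary_strand("ATGCCGTA"))  # -> TACGGCAT
```

Because each base fixes its partner, either strand alone carries the full information of the “twisted ladder.”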

This song from They Might Be Giants explains, in two and a half catchy minutes, how the DNA instructions are translated into making different cell types.

Every year the American Society for Human Genetics sponsors an essay contest for high school students and announces the winners on DNA day. This year’s question was about genetic testing.


Do you think medical professionals should be required for all genetic testing, or should consumers have direct access to predictive genetic testing? 

Check out some of the winning essays here on their website. Congratulations to all who entered!

Film Review: A Wrinkle in Time

The book by Madeleine L’Engle on which this movie was based was one of my childhood favorites. I looked forward to the film eagerly because I wanted to see a gifted director do justice to the material. I thought that many of the changes were promising updates for modern audiences, able to bring the book’s uplifting message of love to more people.

On an even more personal note, my still-unfinished novel, Hallie’s Cache, was inspired by A Wrinkle in Time. In both stories, a misfit young teen girl looks for her missing father and grows into herself in the process. A Wrinkle in Time was rejected by 26 publishers before going on to win the Newbery Medal and become one of the most beloved children’s books of the 20th century. There was a previous attempt at making a movie out of this material, in 2003, with mixed success. It has always defied categorization: is it for adults or children? Is it fantasy or science fiction? Is it too Christian or not Christian enough?

After watching the current version, I’m not convinced that it’s possible to make a good movie out of this book. The director, Ava DuVernay, did everything right: she assembled a great cast and approached the project with care, respect, and a wide-open vision. And I enjoyed it on its own terms; I identified with Meg and her teenage problems. I found Storm Reid to be an appealing and relatable actress. I rooted for her and her friends to save her father. I loved the trippy visuals, the costumes, the animations. I even cried for the brokenness in the world, as gently as it was portrayed, and cheered for the family’s reunion. But it wasn’t the story that packed the emotional punch that I remembered and loved all these years. Opening to mixed reviews and eclipsed at the box office, it is likely to be remembered, if at all, as a footnote to DuVernay’s career.

As much as I hate to admit this about a childhood favorite book, the problem is likely not with the filmmakers, but with the source material. Written in 1962, A Wrinkle in Time is of a particular time and place. The book’s characters are all European-Americans, redheads or mousy-brown-haired, with Anglo-sounding names like Charles Wallace and Dennys (who, with his twin brother, is wholly absent from this movie). It famously opens with the Bulwer-Lytton cliche: “It was a dark and stormy night,” as Meg watches a thunderstorm from her lonely, cluttered attic room. For these reasons, and because of the three witches, the gossip about Mr Murry’s disappearance, and the neighbor’s sheets drying on the line, I had always pictured it taking place in an eccentric, secretive New England small town–a small town with a dark side, like the ones in which L’Engle herself, and her contemporary Shirley Jackson, lived and raised their families.

Bringing this story out into the bright Southern California sunshine as this movie did took too much of the edge off. Certainly there are edgy areas of Southern California too, but we didn’t see those. The Murrys’ home is gorgeous and spacious. The middle school Meg attends is a well-resourced model of ethnic diversity headed by a principal of color who is a three-time science teacher of the year award winner. This muddies the rationale for why Meg is bullied by the other students. In the book, Charles Wallace is an odd prodigy, perhaps a savant on the autism spectrum although that was not understood at the time the book was written, and Meg gets in trouble at school for defending him from bullies. She is thereby always his protector, and her actions at the end of the book, when her fierce love saves Charles Wallace from IT’s clutches, are perfectly in character and make emotional sense.

In the movie, on the other hand, Charles Wallace is still a prodigy, but he appears quite well tolerated, happy, and self-contained, and he doesn’t need Meg to protect him. If anything, he is the one protecting Meg. Meg’s outcast status is instead attributed to her father’s disappearance. But in present-day California, with so many children being raised by single parents and blended families, her father’s disappearance would not be the scandal it was in 1962. Her loneliness can and does lead her to act out further, but it is a feeble justification for her school situation as depicted here.

Details of what happened on Camazotz are also compressed in the movie. The book’s depiction of Camazotz, the planet that has given in to evil, gives off a sort of Kafkaesque bureaucratic banality. Complete conformity to a 1950s suburban nuclear family ideal is expected, and outsiders’ food turns to dust in their mouths. “IT,” the master controller, appears in the book as a disembodied brain on a dais. IT appeals to Charles Wallace and seduces him to ITs side because of ITs ability to control and impose order on messy human impulses. IT was a metaphor for the tyranny of a society that values and runs on brains and intellect alone and disregards love. That in the book there are two battles against IT and that Meg must make the decision on her own to return to Camazotz and rescue Charles Wallace compound the sense of foreboding and dread, as well as making Meg’s triumph sweeter and more meaningful when she does ultimately rescue him.

The movie, however, shows little of Camazotz; the scene with the kids bouncing balls in unison seems like a confusing non-sequitur rather than a Potemkin village masking the fear and desperation of the populace. After her father and Calvin are defeated, Meg is forced immediately into lonely battle with IT. This battle scene was disappointing. Some of the creepy tree-like things with branches might have been meant to be neurons with dendritic trees, but the overall connection of the movie’s IT to an emotionless, loveless, disembodied brain capable of the ultimate in mind control was weak.

The rescue scene itself focused too much on whether Meg herself was lovable in spite of her faults and not enough on the transforming power of Meg’s sacrificial love for Charles Wallace. The book is unapologetically Christian in outlook, reflecting L’Engle’s own Christian faith and naming Jesus as one of the warriors against the darkness that enveloped Camazotz. I believe that L’Engle intended Meg’s love for Charles Wallace here to be selfless and Christlike, yet her Christian imagery and references have been dropped from the movie, to its detriment. Mrs. Who quotes and references many world religions; Christianity could have been included there. And why not acknowledge Jesus’ role, and the role of faith for many Christians, in fighting evil? True to her character, the scientist Meg would likely remain skeptical, and that would be okay too. Warriors can come from all faith traditions, and from no faith tradition.

The other big problem with this story is the science, the so-called “Wrinkle in Time” itself, known in both the movie and the book as a tesseract. Back in 1962, at the dawning of the Age of Aquarius, when relativity still felt new to the popular imagination and humans hadn’t yet gone to the moon, the idea that you could bend space-time with your mind by “tuning” it to the right frequency might have been a little more believable than it is now. There is a scene in the movie in which Dr. Murry is shown giving a seminar about how tessering works. Jeers and guffaws of disbelief come from the audience, as well they would in real life. He sounds like a New Age motivational speaker in the tradition of Werner Erhard, or a trickster like Uri Geller.

I still remember when an annoying boy in my physics class explained to me what a tesseract “really” was: a cube within a cube, a projection of 4-dimensional space into 3 dimensions, the way the drawing of a cube on paper as a square within a square is a projection of 3-dimensional space onto 2. There was nothing about wormholes or traveling 93 million miles with just your mind. Talk about disappointing! It wouldn’t have taken much to give Dr. Murry in the film a high-tech device that would make tessering possible, or some novel psi powers based on his and his wife’s research. These would have to be hand-wavey and entirely fictional of course, but good shows have been based on less. What doesn’t work is asking us to accept that New Age mumbo-jumbo somehow became true for this family because they “believed in themselves.”
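For the curious, here is a small, purely illustrative Python sketch of that cube-within-a-cube picture (my own example, not the classmate’s): the 16 vertices of a 4-dimensional hypercube are every combination of ±1 in four coordinates, and dropping the fourth coordinate while shrinking the w = +1 cube gives the familiar nested-cube drawing.

```python
from itertools import product

# The 16 vertices of a tesseract (4-D hypercube): all combinations of +/-1
# in four coordinates.
vertices_4d = list(product((-1, 1), repeat=4))

def project_to_3d(vertex, inner_scale=0.5):
    # Crude "cube within a cube" projection: drop w and shrink the w = +1 cube,
    # the 4D-to-3D analogue of drawing a 3-D cube as a square inside a square.
    x, y, z, w = vertex
    scale = inner_scale if w > 0 else 1.0
    return (x * scale, y * scale, z * scale)

vertices_3d = [project_to_3d(v) for v in vertices_4d]
print(len(vertices_4d), "vertices in 4-D")  # 16
print(vertices_3d[:4])                      # a few of the projected 3-D points
```

No wormholes, no mind powers: just geometry, which was exactly the annoying boy’s point.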

As a smart girl who was interested in science, I believed too long in this book’s oversimplified and inaccurate version of how science works. Meg’s parents worked together in a homemade lab; when her father disappeared, her mother continued her experiments in the kitchen, Bunsen burner on one counter, soup on the other. There was no mention of grants, funding, students, safety regulations, collaboration outside the family unit, or even publication. It’s a more romantic and family-friendly vision of doing science than has ever actually existed. Perhaps this vision was inspired by the real-life Curies, French Nobel Laureates Marie and Pierre and their children, who pioneered the study of radioactivity; yet their work had a visible dark side. Modern science is safer for its practitioners, and it is more open and collaborative than either what the Curies experienced or L’Engle’s vision. It is also more expensive and more technical, and requires more energy and perseverance than romantic genius for success. A film that wants to inspire young people in science in the 21st century would do well to tell a more accurate story about the scientific process. This film drops that ball completely.

I hope this movie inspires its young audience, but sadly, I don’t believe it’s memorable enough for that.

Already? Why I don’t like Daylight Saving Time anymore

I haven’t blogged yet this month and suddenly it’s mid-March already. Tonight we set the clocks ahead one hour. This time of year brings out a predictable spate of articles about the history of Daylight Saving Time (not “Savings” Time) and partisans on both sides weigh in.

The one thing everyone seems to agree on is that they hate changing the clocks. I used to not mind it as much as I do now, but that was before it came so G-d-awful early in the year. Daylight saving time started in the USA in 1918 with the idea of saving energy. That means it is 100 years old this year. I actually remember the twice-yearly clock ritual as a child with a little fondness. Thinking it through helped me understand clocks, timekeeping, time zones, circadian rhythms, and jet lag a little better. And somehow the stakes were lower for sleeping in.

But we don’t currently observe your grandfather’s Daylight Saving Time. The first federal standards established that DST would start on the last Sunday in April and end the last Sunday in October. But beginning in 2007, DST was extended by about a month, into March and November. This extension was politically motivated and driven by candy and golf industry lobbying. The hoped-for energy savings have not materialized.

This is when I started to get angry about this issue. Nobody asked us: there was no popular vote on this change, and no consideration for what the time shift does to the human body clock.


Daylight Saving Time doesn’t actually save anything. It just shifts light from the morning, when it is needed to entrain the body’s internal clock, to the evening, when it contributes to insomnia. Especially when instituted so early in the year, before the equinox (before the day is even as long as the night), it puts everyone in a state of perpetual eastward-going jet lag.

The body’s clock never actually adjusts to daylight time. The spring-forward clock change is associated with a 25% increase in the risk of heart attacks and an increase in traffic accidents on the Monday following the time change (be careful out there!). And it’s in exactly the wrong direction to help sleep-deprived teens be able to get up for school. Daylight Saving Time is one of many factors that disconnect humans from the natural world.

I think we should just end this 100-year-old experiment altogether, and live on standard time all year round. But I’d settle for a return to the standards of the 1960s and early 1970s, when turning the clocks ahead really meant the coming of spring.

A clock in a German town