Judi Brown Clarke receives Excellence in Diversity Award

We are proud to congratulate BEACON’s Diversity Director Judi Brown Clarke for receiving the Individual Sustained Effort Toward Excellence in Diversity Award. The Excellence in Diversity Award program recognizes outstanding efforts of faculty, students, and staff at MSU who are committed to the principles of diversity and inclusion and who actively engage in activities demonstrating a sustained commitment to these principles.

The award ceremony will be held at The Kellogg Hotel and Conference Center, Big Ten A, on the MSU campus, on Monday, February 13, 2017, at 4:00 p.m.

Posted in BEACON in the News, BEACONites | Tagged | Comments Off on Judi Brown Clarke receives Excellence in Diversity Award

Evolving Evacuation Plans for Urban Areas

This post is written by UI grad student Keith Drew

University of Idaho Evacuation Planning Research Team

The team at the University of Idaho currently consists of four people: Dr. Robert B. Heckendorn, Keith Drew, Homaja Marisetty, and Madhav Pandey. Our team also includes researchers at Michigan State University, led by Dr. Kalyan Deb. Our work is focused on evacuation planning and emergency management, using evolution strategies to evolve traffic assignments, or probability-based instructions, for intersections in urban areas. The problem we want to solve is traffic congestion during an evacuation, while also being able to adapt to changes in the environment. Our goal is to evolve real-time traffic distributions in the face of disasters and other evacuation events; the group at Michigan State University is seeking to solve the same problem using a different approach. So far, we at the University of Idaho have created an implementation of our model and begun running experiments.

I am a graduate student in Computer Science at the University of Idaho, where I also completed my undergraduate degree. I was introduced to the project by Dr. Heckendorn, and have found it to be engaging work so far. Personally, I am interested in this work because I feel it is important. A real-world application leveraging evolution strategies appeals to me in two significant ways. First, the work could lead to a life-saving evacuation management system, which appeals to my practical side; second, working with evolutionary heuristics is always compelling for the sake of watching solutions to difficult problems take shape.

Our research goal is to find a way to provide real-time solutions to evacuation problems, which are constrained by congestion, time, and dynamic events. In the past, evacuations have led to large congestion problems in the traffic systems being evacuated. Our work seeks to provide traffic distributions that tell vehicles which way to go at intersections, adapt to changes in the environment, and optimize routing for the safety of the population being evacuated. Our specific approach is still largely untested, but it looks promising. The idea is to evolve probability distributions for each intersection throughout the area being evacuated. These probabilities route traffic by splitting the flow at each intersection and sending the appropriate proportion in each direction. We judge each set of such probabilities based on how safe every member of the population is at a given time. For example, if a hurricane will move over the evacuation area in one hour, we simulate one hour and evaluate safety at that point. Evacuation plans vary by the type of event calling for an evacuation, such as a flood or a hurricane. In some cases elevation is a key component of safety; in other cases distance is the key focus. By evaluating the safety of the population, we hope to find non-intuitive solutions for any type of evacuation event.
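
To make the approach more concrete, below is a minimal sketch of the core loop: a simple (1+1) evolution strategy over per-intersection routing probabilities. This is an illustrative toy, not our actual implementation (which is written in C++ and driven by a full traffic simulation); the network size, mutation step, and safety function here are placeholder assumptions.

```python
import random

# Hypothetical toy settings (not from the real model)
N_INTERSECTIONS = 4   # intersections in the evacuation area
N_EXITS = 3           # outgoing directions per intersection
SIGMA = 0.1           # mutation step size
GENERATIONS = 200

def random_distribution():
    """A random probability distribution over the outgoing directions."""
    w = [random.random() for _ in range(N_EXITS)]
    s = sum(w)
    return [x / s for x in w]

def mutate(genome):
    """Perturb every intersection's distribution and renormalize."""
    child = []
    for dist in genome:
        w = [max(1e-6, p + random.gauss(0, SIGMA)) for p in dist]
        s = sum(w)
        child.append([x / s for x in w])
    return child

def simulate_safety(genome):
    """Stand-in for the traffic simulation: it should route vehicles through
    the network for the event horizon and return the fraction of the
    population judged safe. Here it is just a dummy objective that rewards
    sending most of the traffic out the first exit."""
    return sum(dist[0] for dist in genome) / N_INTERSECTIONS

parent = [random_distribution() for _ in range(N_INTERSECTIONS)]
parent_fit = simulate_safety(parent)

for _ in range(GENERATIONS):
    child = mutate(parent)
    child_fit = simulate_safety(child)
    if child_fit >= parent_fit:   # keep the safer traffic assignment
        parent, parent_fit = child, child_fit

print("best safety score:", round(parent_fit, 3))
print("distribution at intersection 0:", [round(p, 2) for p in parent[0]])
```

In the actual system, the dummy objective would be replaced by the safety evaluation produced by the traffic simulation described above.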

Another aspect driving our research is the dynamic nature of real disaster and evacuation events. Our model treats these changes as constraints. For example, we want to be able to handle a change in the availability of certain routes during a disaster. Such a change might be a bridge washing out or collapsing, or perhaps a road becoming partially or fully blocked due to a traffic accident. Another type of change involves the way that people behave while driving. If people are told to evacuate a certain way yet choose to follow their own instincts, congestion can arise, and our model needs to be able to handle those changes as well. Another, more obvious type of change is the disaster or danger that people are evacuating from. Consider a hurricane, which moves over time: safe areas become unsafe, and unsafe areas become safe once again. In the face of such changes, evacuation plans can be undone, and we want to provide a solution that can adapt to any such change.

An example of our model output

Once an algorithm or model is created that works well and can create robust traffic assignments for large areas, communicating that information to the evacuees becomes a new challenge. Possible methods for communicating such information include vehicle-to-vehicle communication, self-driving cars, and more. Self-driving cars are of particular interest, as studies have shown that automation can improve traffic conditions significantly, and some work has shown that self-driving cars can work together to navigate intersections without stopping; they simply speed up or slow down on approach to the intersection. With such technology, our model might be able to provide the directions needed by such cars.

My contribution to the project so far has been helping to develop the model we are using, as well as implementing it using a combination of a lexical analyzer, a parser, and C++. Currently our system includes a simulation for running traffic through a graph that represents the area being evacuated; an evolution strategy algorithm, which evolves the traffic distributions; and a grammar used to specify evacuation areas and tests for our model. Another key component, created by Homaja, is a visualization tool written in Processing. The tool takes output from our model and creates visualizations of traffic moving through the grid, as well as graphs that allow us to ensure the model is working as intended. It’s very nice to watch vehicles traverse our little city and escape some unseen danger.

In the end, we are leveraging knowledge and experience from the realms of evolutionary computation, emergency management, civil engineering, and traffic analysis and behavior, all to create a system which can, ideally, minimize any loss of life during large emergency events. So far, preliminary results look very promising, and I am definitely enjoying this work.

Posted in BEACON Researchers at Work | Tagged , , | Comments Off on Evolving Evacuation Plans for Urban Areas

Charles Ofria named one of the 2017 William J. Beal Outstanding Faculty

Charles Ofria, 2017 William J. Beal Outstanding Faculty Award Recipient

We are proud to congratulate BEACON’s Deputy Director Charles Ofria for being named a 2017 William J. Beal Outstanding Faculty Award winner.

Charles Ofria is recognized internationally for his research at the interface of computer science and evolutionary biology. He developed the Avida Digital Evolution Research Platform, wherein self-replicating computer programs are subject to mutations and selective pressures, resulting in an open-ended evolutionary process. Because these digital organisms exist inside a computer, Ofria can easily study long-term evolutionary processes and, in turn, apply what he learns toward solving computational problems.

Ofria is one of the founders of the $50 million BEACON Center for the Study of Evolution in Action at MSU, an NSF-supported center that allows engineers with an applied evolutionary focus to work with evolutionary biologists to create a theoretical foundation for both computational and biological research.

As part of his role in founding the BEACON Center for the Study of Evolution in Action, Ofria developed a course, Multidisciplinary Research Methods for the Study of Evolution, for the purpose of mentoring students on the research process at the intersection of fields. He covers communicating across fields, developing interesting research questions, performing literature searches, formulating and testing hypotheses (along with using simple statistics), and presenting research results so they are accessible to a range of audiences.

Each year, multiple groups in the class publish their work after the end of the semester, with students universally agreeing that they have gained tremendous insight into the research process. Teaching reviews from other classes note his ability to convey abstract concepts in a practical manner, often through live-coding demonstrations, real-world examples and instructional technologies that provide students with instantaneous feedback.

His teaching has been recognized with the MSU Teacher–Scholar and the Withrow Teaching Awards.

Ofria is the president of the International Society for Artificial Life and a member of the editorial board for “PeerJ Computer Science.” He is active in reviewing articles for a number of prestigious journals, including “Nature,” “Communications of the ACM: IEEE Proceedings of Artificial Intelligence” and “Journal of Theoretical Biology.”

MSU will celebrate the accomplishments of 10 William J. Beal Outstanding Faculty Award winners and other recipients of all-university awards during Michigan State University’s annual Awards Convocation on Tuesday, Feb. 7 at 3:30 p.m. at Wharton Center’s Pasant Theatre. Two faculty from the College of Engineering — Bradley Marks and Charles Ofria — will be honored. The 2017 honorees bring the number of MSU faculty honored to 541 since the award was established in 1952.

Colleagues, friends and family are invited to share the event with the awardees. MSU President Lou Anna K. Simon will salute their contributions to the university’s excellence. Simon also will take a few minutes to acknowledge MSU’s Founders Day as well as deliver the 2017 State of the University address.

Watch the celebration at WKAR.org
There will be a link to the live stream of the celebration at wkar.org and www.ahr.msu.edu/all-university-awards. The William J. Beal Outstanding Faculty Awards are supported by the Office of University Development.

William James Beal (March 11, 1833 – May 12, 1924) was an American botanist, who was professor of botany (1871-1910) and curator of the museum (1882-1903) at the Michigan Agricultural College (MAC), now MSU. He was a pioneer in the development of hybrid corn and the founder of MSU’s renowned W. J. Beal Botanical Garden.

Related Website:
Story courtesy of MSUToday

Posted in BEACON in the News, BEACON Researchers at Work, BEACONites, Member Announcements | Tagged , , , | Comments Off on Charles Ofria named one of the 2017 William J. Beal Outstanding Faculty

Indigenous Evolutionary Knowledge Survey

This post is written by MSU postdoc Wendy Smythe

For the purpose of this survey, we are asking Native American, Alaska Native, Pacific Islander, and Hispanic individuals to take this 15-minute survey as a means to give voice to the views and opinions of Indigenous people.

The goal is to take an active role in helping bridge the gap of knowledge between non-Native scientists and Indigenous communities by utilizing traditional knowledge to inform science instruction more effectively. Indigenous people possess Traditional Ecological Knowledge (TEK) that has led to the sustainability of many ecological resources—resources that have in turn sustained these communities since time immemorial. However, the emerging climate crisis and increased anthropogenic activities have begun to threaten and deplete these resources. To address these concerns, the science, technology, engineering, and math (STEM) fields have begun to collaborate with Native American and Alaska Native (NA/AN) communities in an effort to monitor, manage, and protect natural resources.

Currently there is a disconnect between the STEM fields and TEK with regard to ways of knowing and how to ethically and respectfully use TEK. This disconnect is often reflected in science instruction, where few educators completely understand Indigenous worldviews and their effects on student learning. Kimmerer provides a more descriptive definition of TEK, stating that TEK is:

“knowledge, practice, and belief concerning the relationships of living beings to one another and to the physical environment, held by peoples in relatively nontechnical societies with a direct dependence upon local resources…it is born of long intimacy and attentiveness to a homeland and can arise wherever people are materially and spiritually integrated within their landscape. TEK is rational and reliable knowledge that has been developed through generations of intimate contact by native peoples with their land”.

Below is a link to a survey on Indigenous Knowledge, intended to collect data on opinions in Indian Country about TEK/STEM and evolution. We are trying to get 300 responses; the first 50 respondents will get a $10 Amazon gift card, followed by a random drawing of 25 respondents for a $5 Amazon gift card. We welcome any age group, all education levels, and anyone Indigenous (Native American, Alaska Native, Mexican, Pacific Islander).

The survey takes about 12 minutes to complete (and must be completed to get the Amazon gift card).

The survey will be open until February 16, 2017.

The best way to effect change in Indigenous education is by listening to the voices of the people.

Here is the link.

Posted in BEACON Researchers at Work | Tagged , , | Comments Off on Indigenous Evolutionary Knowledge Survey

Studying the Evolutionary Dynamics of Emergent Phenotypes

This post is written by MSU faculty Mark Reimers and Arend Hintze

Let us marvel at the complexity of life for a moment. We have DNA transcribed into mRNA, just to get that translated into proteins, which metabolize, catabolize, or process many other molecules and are responsible for the form and function of each cell. But it doesn’t end there: cells form aggregates and make tissues, which make organs, which form organisms, which are controlled by a complex neural network, just so that we can have social interactions and form communities. And again it doesn’t end there. Each of these steps is itself dependent on all the other layers of interaction. You see an attractive mate, your brain processes this information, it secretes hormones, which trigger signal transduction cascades, which express genes and regulate other functions, and as usual, the story told much later is: “Well, one thing led to another, and Mommy and Daddy fell in love, and then you were born!” – evolution in action!

But it also makes us wonder whether our computational models can keep up with this nested complexity. In pretty much every system we use to study evolution, we have some form of genome that gets translated into something whose performance we evaluate, and maybe we have agents interacting, but these (at most three) layers of modeling are far from the apparent complexity of nature – does it matter?

We think it is more than fair to say that we have learned a great deal about evolutionary processes and evolution in general from computational models, and quite frankly, that was possible because they were simple. Every time you add another complication to your model, you potentially open a can of worms and need to control for yet another factor. Simplicity is the key to successful research.

Most simulation studies at BEACON assume a one-to-one correspondence between ‘genes’ and traits. This strategy makes sense for simulating the evolution of bacteria, whose business is biochemistry and where many phenotypes depend on one gene (or one operon); the E. coli studies and single-objective evolutionary algorithms were the two key inspirations for BEACON. However, we argue that this approach is insufficient for studying metazoan evolution, because animals are constructed through the interaction of many components specified by genes. Each trait or body feature is then affected by many genes, and most genes affect several distinct traits, although some of these effects may be revealed only under a life stress not typically encountered, or hard to test, in the lab.

At this point we want to broaden our understanding and use computational models to understand more complex biological processes, in particular those where the system itself changes over time, typically through neuronal or developmental plasticity. In both cases, the genome isn’t translated into the final structure; instead, the genome encodes a process that controls an ever-changing system. In terms of developing a computational model, it becomes less about specifying a form and more about specifying the rules that control a dynamic system. But even that would not really capture nature’s complexity; the challenge becomes specifying rules that specify rules, which control rules, all codependent and interacting. Or maybe this isn’t necessary, and simple models are already sufficient, with no additional effect on evolutionary dynamics or adaptation. Quite frankly, we don’t believe that. The rate of adaptation, and the fixed point at which one arrives, depend strictly on how the fitness landscape is explored, and it is mutations that drive that exploration. If mutations all have a direct effect on how they move the organism around in the fitness landscape, then exploration will be local. If the effects are random, organisms will jump around the landscape randomly. Consequently, a complex nested system with neural or developmental plasticity will not only move around the fitness landscape in strange ways; it will also start at one point in the landscape and, due to its lifetime adaptability, move to a different point over its lifetime, with the genes controlling how this lifetime movement happens.

If we want to study how animals and their traits evolve, we need to consider how genes affect developmental processes, and model such processes in our simulations. Development proceeds by signals between cells (or other components); the strength of these signals is specified by genes, but the consequences for traits depend on the interaction of any modified signal with all the others active in the same place at the same time. Of course we need to abstract from the complexity of nature.
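
As a toy illustration of the difference between these two kinds of genotype-phenotype maps, consider the sketch below. It is not taken from any of our models; the signalling rule, parameter values, and cell counts are made up for the example. In the direct map a single mutation changes exactly one trait, while in the developmental map the same single-gene change ripples through the entire phenotype.

```python
def direct_map(genome):
    """One gene corresponds to exactly one trait."""
    return list(genome)

def developmental_map(genome, n_cells=6, steps=10):
    """Genes set the strength of signals passed between neighbouring 'cells';
    the phenotype is whatever pattern the process produces."""
    left_signal, right_signal, decay = genome[0], genome[1], genome[2]
    cells = [1.0] + [0.0] * (n_cells - 1)   # a single founder cell
    for _ in range(steps):
        new = []
        for i in range(n_cells):
            left = cells[i - 1] if i > 0 else 0.0
            right = cells[i + 1] if i < n_cells - 1 else 0.0
            new.append(decay * cells[i] + left_signal * left + right_signal * right)
        cells = new
    return [round(c, 3) for c in cells]

genome = [0.4, 0.1, 0.8, 0.2]   # hypothetical gene values
mutant = list(genome)
mutant[0] = 0.6                 # mutate a single gene

print("direct, parent:       ", direct_map(genome))
print("direct, mutant:       ", direct_map(mutant))            # only trait 0 differs
print("developmental, parent:", developmental_map(genome))
print("developmental, mutant:", developmental_map(mutant))     # every cell differs
```

The point is not the particular rule, but that the unit of heredity is a rule parameter rather than a trait, so the mapping from mutation to phenotypic change becomes indirect.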

Other considerations suggest that we should explore this. Evolution typically selects via very many criteria simultaneously. Although research over the past few decades has shown how evolutionary algorithms work for a single criterion, and to some extent for two, we have little idea about how to select for many criteria simultaneously. However, this is exactly what happens in animal evolution. As Gerhart and Kirschner argue in The Plausibility of Life, summarizing the work of many evo-devo labs, the flexibility of a metazoan to adapt simultaneously to many different criteria and changing selective forces depends on indirect and emergent mechanisms generated by exploratory processes, and ‘weak linkages’, both specified by genes. If we want our simulations to be relevant to the relation between molecules and animal forms or behavior, we should simulate such mechanisms.

We think this approach will likely also shed light on one of the major issues in human molecular genetics. Despite the promise of the Human Genome Project to identify the genetic variants that contribute to complex human diseases and thus to clarify the molecular processes that drive such diseases, little actionable knowledge has been accumulated nearly 20 years on. What evidence we have about haplotype blocks and purifying selection suggests that many disease-related variants have actually been selected for, rather than against. Such observations cannot be explained by the kind of ‘one gene/one trait’ models often studied, but they are entirely consistent with the many and varied selective pressures that are brought to bear on a complex long-lived animal in changing circumstances.

Similar considerations hold for evolving behavioral traits. It is easier in simulations to specify discrete behaviors through distinct genes rather than to simulate the processes that produce behavior, and so such simulations are a natural first step. But if we want our simulations to be more relevant to animal behavior, then we need to consider the development of the nervous system and the history of learning, the processes of which, but not the outcomes, are specified by genes.

In the last few years we at BEACON have made considerable progress in understanding how complexity can emerge through the evolution of simple traits; we have expanded our repertoire of computational tools; and we have learned to work more closely and effectively across disciplines. Now we contemplate the prospect of researching the layers of complexity possible through evolving emergent systems; it is mind-boggling, as we open a door to glimpse what nature holds in store for us. We look forward to opening this door even further with the support of BEACON and to critically discussing insights with our community!

Posted in BEACON Researchers at Work | Tagged , , , , , | Comments Off on Studying the Evolutionary Dynamics of Emergent Phenotypes

Rock the Chalk: Reevaluation of dropping PowerPoint for a large lecture classroom

This post is written by MSU faculty Chris Waters

I am the course administrator and sole instructor for the junior/senior “MMG 431: Microbial Genetics” course at MSU. This is a large lecture course consisting of ~150 students. My goal is for the students to learn, at a molecular level, the fundamental principles underlying microbial genetics and to develop skills in formulating genetics-based scientific approaches to understand biological systems. The Fall of 2016 was my eighth year teaching the course, and the fourth year that I was the sole instructor. I am passionate about the material, and I love to see the students blossom into new microbial geneticists as the semester progresses!

Fig. 1. Actual lecture from 2016 with a pre-class riddle. Can you guess it?

Due to an inattentive classroom and my propensity to add too much information and detail, last year I dropped PowerPoint for my lectures and instead switched to a “chalk-talk” style of teaching. I do this using OneNote and a touchscreen laptop projected on two screens (example in Fig. 1), only occasionally using slides for very complicated structures. I also have multiple in-class active learning questions that utilize iClickers, and weekly online homework. I reported an analysis of the results from my first year of teaching in this format in this blog last year: 2/2016-Point Break: My experiment with dropping PowerPoint in a large lecture course. To summarize that post, I saw an improvement in all outcomes, including student review scores, the ratio of positive to negative student comments, and student grades, relative to the previous two years of using PowerPoint. However, with an n of 1 it was possible that the results were an anomaly. In this blog, I repeat the analysis to ask whether dropping PowerPoint really did improve course outcomes.

This ain’t my first rodeo

Unsurprisingly, the second time I used the chalk-talk approach to teach my course was much easier. I had already distilled the previous PowerPoints into the pertinent notes, so I spent less time preparing for class and was more comfortable with each lecture. When teaching in this style it can be easy to forget or leave off important points over time, so one must be careful to reexamine each lecture. I also upgraded my laptop to a Microsoft Surface Book (love it!), and I could write notes more naturally with the touchscreen pen. I also implemented pre-class riddles or puzzles to get the students’ brain juices flowing (see Fig. 1). This was a lot of fun and helped to make the class environment more relaxed. Finally, I proceeded more slowly through the material, not worrying too much about staying on track with the syllabus. This meant a couple of topics didn’t get covered, but overall understanding improved. In fact, this is the first year that I have ever had the comment “I wish we could have gone faster.” Add that to the list of unbelievable things that happened in the Fall of 2016.

Fig. 2. SIRS aggregate categories

Student evaluations-What did they think?

All students in MSU courses are required to fill out SIRS evaluation forms. These consist of 21 questions scored from 1 to 5, with 1 being the top score and 5 being the worst. The scores are then aggregated into the six general categories shown in Fig. 2. I am happy to report that all five categories exhibited lower (i.e., better) scores relative to the previous year, and these were my best SIRS scores in eight years of teaching. One could argue that this is due to increased experience with the course; however, by the seventh and eighth year of teaching this course I think that experience is not much of a factor, and I’ll note that scores in 2013 and 2014, before I made the switch, were stable.

Fig. 3. Net difference in aggregate SIRS scores for the six categories.

To determine which category benefited the most from dropping PowerPoint, I subtracted the lowest scores (all in 2016) from the maximum scores (either in 2013 or 2014) in each category (Fig. 3). “Enjoyment of the Course” showed the greatest gain in student evaluation scores, followed by “Instructor Involvement” and “Course Demands”. We all want our students to have fun and share in the passion that we have for the material, so I am quite pleased that course enjoyment has improved so much!

Fig. 4. Grouping of specific student’s SIRS comments into negative, positive, or neutral.

The final data that can be gleaned from SIRS reports, and perhaps the most important, come from the specific comments section. I analyzed the specific comments provided by the students and grouped them into positive, negative, or neutral categories. The results are surprisingly consistent with 2015, showing that the majority of responses were positive, which is a dramatic shift from 2013 and 2014 when I used PowerPoint (Fig. 4). The negative comments fell into three general categories: 1) it is difficult to take notes/lectures should be posted, 2) the homework is not helpful/hard, and 3) more guidance should be given for test preparation. I was inspired to read, in some of the positive comments, about students who had no interest in microbial genetics before the course but were now motivated to learn more about the field and possibly pursue studies in this area!

The rubber meets the road-How were the grades?

Fig. 5. Score breakdown from 2013-2016.

It is well and good that the students enjoy the class, but they are there to learn. So perhaps the most important indicator of success is the breakdown of the final grades, shown in Fig. 5. The most interesting observation was that the 4.0 group had the largest number of students, comprising almost 25% of the course. Compared to 2015, there were fewer mid-range students in the 3.5 to 3.0 categories, and more students in the 1.5 to 0.0 group. This is surprising, and a bit disappointing, considering that I covered less material in 2016 than in 2015.

No going back

The analysis of the second year of chalk-talk teaching suggests that the improved outcomes versus PowerPoint lectures were real, and I will continue to teach in this manner. I did give two recorded lectures in the Fall of 2016 due to travel, and many students commented that it was helpful to be able to pause, rewind, and review parts of the lecture to assimilate the information. For Fall 2017, I plan to post my lecture notes and record my lectures to help those students who have trouble taking notes during class, and perhaps try a flipped classroom approach to incorporate more active learning. It can be a bit scary to drop PowerPoint, but it has been the best decision I have made as a teacher at MSU.

Posted in BEACON Researchers at Work, Education | Tagged , , | Comments Off on Rock the Chalk: Reevaluation of dropping PowerPoint for a large lecture classroom

Bio-inspired computation

BEACON Distinguished Postdoc Amir Gandomi was recently interviewed about his work in Zygote Quarterly, after his talk at the 1st NASA Biomimicry Summit and Education Forum, called “an uncommon event of cross-fertilization co-sponsored by the NASA Glenn Research Center, Great Lakes Biomimicry, and the Ohio Aerospace Institute.” The full proceedings of the summit will be published by the Virtual Institute for Bio-inspired Exploration.

Would you please tell us about nature-inspired computation?

Nature-inspired computation (NIC) has been widely used during the last two decades and has remained a highly researched topic, especially for complex real-world problems. NIC techniques are a subset of artificial intelligence, but they are slightly different from classical methods in the sense that the intelligence of NIC comes from biological systems, or nature in general. The efficiency of NIC is due to its significant ability to abstract key principles of evolution in nature, which has demonstrated its capability over millions of years.

What impact do you hope/expect/intend your conference talk to have on your profession and/ or others? How will it advance the field?

First, I hope to encourage people to consider natural computing for their problems. I have shown nature-inspired computation applications in three main stages of engineering systems: modeling, optimization, and monitoring. I intended to convey its applicability to a wide range of problems, such as engineering optimization and (big) data mining. To me, all real-world problems can be defined as optimization problems, since we have objective(s) and variables in all of them. Therefore, I hope people will also get a sense that nature-inspired computation can be used for their problems even if they are dealing with a complex, black-box system. Finally, I would like to encourage researchers to look at nature as a well-trained system and consider it as a source of inspiration, no matter what their topic.
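
To illustrate the kind of black-box optimization described here, the following is a minimal sketch (not from the interview) of one classic nature-inspired technique, particle swarm optimization, applied to a stand-in objective function. The dimensions, swarm size, and parameter values are arbitrary choices made for the example.

```python
import random

def blackbox(x):
    # Stand-in objective: in practice this could be any simulation or model
    # that returns a score from a vector of decision variables.
    return sum(v * v for v in x)

DIM, SWARM, ITERS = 5, 20, 100
W, C1, C2 = 0.7, 1.5, 1.5        # inertia and attraction weights

pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(SWARM)]
vel = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in pos]                  # each particle's best position
pbest_f = [blackbox(p) for p in pos]
g = min(range(SWARM), key=lambda i: pbest_f[i])
gbest, gbest_f = pbest[g][:], pbest_f[g]     # best position found by the swarm

for _ in range(ITERS):
    for i in range(SWARM):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (W * vel[i][d]
                         + C1 * r1 * (pbest[i][d] - pos[i][d])
                         + C2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        f = blackbox(pos[i])
        if f < pbest_f[i]:
            pbest[i], pbest_f[i] = pos[i][:], f
            if f < gbest_f:
                gbest, gbest_f = pos[i][:], f

print("best objective value found:", round(gbest_f, 6))
```

Only the objective function needs to change to apply the same loop to a different problem, which is what makes such methods attractive for black-box systems.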

What are your impressions of the current state of bio-inspired computation?

Nowadays, after many successful cases have been reported, more researchers are interested in bio-inspired computation, and several centers focusing on these topics have emerged recently. Our center (the BEACON Center) is an NSF Science and Technology Center that focuses on bio-inspired approaches, and bio-inspired computation in particular. The center is a consortium of five different universities, and interest in the topic continues to grow every day.

Read the whole interview here!

Posted in BEACON in the News | Tagged , , , | Comments Off on Bio-inspired computation

Big things happen in small rodents: grasshopper mice as a model for the evolution of pain resistance

This post is written by MSU grad student Lauren Koenig

Lauren Koenig trapping small mammals for a previous field project in Colorado

Life in the desert is full of extremes. Daytime temperatures are scorching, monsoon rains are torrential, and plants are sparse and spiky. Yet many desert animals, such as grasshopper mice (Onychomys torridus) and pinacate beetles (Eleodes longicollis), are able to thrive in these conditions.

So what is the key to survival? For these species, it is the development of extreme adaptations in response to their environment, as well as to each other.

The evolution of adaptations and counteradaptations between two species is known as an evolutionary arms race. This phrase may aptly bring to mind an image of two countries duking it out, building bigger and better nuclear missiles. In the case of mice vs. beetles, the beetle’s missile is a nasty chemical spray consisting of benzoquinones and the launchpad is at the base of its abdomen. When threatened, pinacate beetles will do a headstand, a position best suited to target an oncoming predator’s eyes, nose, and mouth. While pinacate benzoquinones are not deadly to humans, they wreak havoc on sensitive tissue. For a much smaller mammal, benzoquinones will burn and blind on an even more intense level.

A grasshopper mouse attacks a pinacate beetle

In turn, the grasshopper mouse appears to have evolved a superhero trait of its own. The well-known entomologist Thomas Eisner first described this attack and claimed that grasshopper mice avoid being sprayed by placing the beetle’s butt-end in the sand so that the glands discharge harmlessly into the soil1. I’m not so sure that the case is so cut and dried, however. Using its front paws, the mouse will try to grab the beetle, keeping it still long enough to take an incapacitating bite. This manipulation isn’t enough to prevent an onslaught of chemical spray to the face, and the mice show signs of discomfort (e.g., grooming, burying behavior). Both Eisner and I can agree, however, about the truly remarkable end to this battle. Grasshopper mice are not deterred from their pursuit. They are swift, vicious, and persistent carnivores. The desert floor is littered with the empty shells of pinacate beetles that met a similar demise at the hands of the only rodent species that seems to consistently withstand the spray. Deer mice, the facultatively insectivorous cousins that share the same habitat and encounter the same insects as grasshopper mice, are much less persistent in their pursuit of pinacate beetles and consume them far less often2.

So what makes grasshopper mice such rare rodents? What secrets lie in their physiology that make them less like Mickey Mouse, and more like Monty Python’s killer rabbit?

It turns out that grasshopper mice have some very weird and fascinating responses to pain. In addition to pinacate beetles, they prey on all the classic horror film stars, like tarantulas, centipedes, and bark scorpions, which possess one of the most painful stings. An elegant study by Drs. Ashlee and Matt Rowe discovered that scorpion venom binds to a sodium channel receptor that switches the venom’s effect from pain to that of an analgesic3. Essentially, the mice use the scorpion’s own defense mechanism as the very tool that allows them to successfully eat scorpions, no matter how many times the mice are stung. Deer mice, in comparison, won’t survive long after the first sting.

Benzoquinone, however, does not target sodium channels. It is likely that benzoquinone targets TRPA1, a conserved calcium channel in the nose and mouth that is found across the animal kingdom, from humans to Drosophila. It mediates reception of pain, temperature, touch, spice, and caffeine, among other stimuli. This is an excellent starting point for exploring the mechanism of pain resistance in grasshopper mice – we know that they exhibit reduced sensitivity to formalin, another TRPA1 agonist3. TRPA1’s versatility means that any organism that develops an antipredator system through TRPA1 disruption could target many predators with a single stroke. In response, a predator with a modified TRPA1 channel immune to that disruption could take advantage of prey that is inaccessible to most of its competition.

Here is where things get interesting. There’s a reason why most animals have not evolved resistance to pain. Pain serves a critical function in the nervous system to warn the body of potential damage (i.e. it tells you not to walk on a broken foot so that the foot can heal). Prey, like scorpions, take advantage of this in order to signal that they are harmful. If an animal lives to eat again it likely learns to never, ever try and eat a scorpion – and that’s a good thing for both the predator and the prey. Therefore, the crucial function of pain in survival ensures that natural selection favors pain sensitivity. So how and why do pain-pathway adaptations exist?  

It is at the heart of this paradox that I aim to pursue my graduate research. By studying grasshopper mice as the exception to the pain-pathway standard, I hope to learn about the underlying mechanisms behind their pain tolerance.

Why should we care about a rodent-sized war happening in the middle of the Arizona desert? As researchers discover more about TRPA1, we’re realizing that this receptor plays an even more integral role in the nervous system than previously thought. TRPA1 receptors serve many types of functions and are found even in humans. Not all the signals they send are welcome, however, as TRPA1 is involved in inflammatory, neuropathic, and migraine pain, as well as airway diseases and diabetes4. The more we learn about TRPA1, the more we can learn about our own responses to pain and how to block it. By studying animals in which pain blockage is successful, perhaps we too can someday swallow a pill-sized dose of grasshopper mouse superpowers.

References

  1. Eisner, T., & Meinwald, J. (1966). Defensive Secretions of Arthropods. Science, 153(3742), 1341–1350.
  2. Parmenter, R. R., & Macmahon, J. A. (1988). Factors limiting populations of arid-land darkling beetles (Coleoptera: Tenebrionidae): predation by rodents. Environmental Entomology, 17(2), 280-286.
  3. Rowe, A. H., Xiao, Y., Rowe, M. P., Cummins, T. R., & Zakon, H. H. (2013). Voltage-gated sodium channel in grasshopper mice defends against bark scorpion toxin. Science, 342(6157), 441-446.
  4. Nassini, R., Materazzi, S., Benemei, S., & Geppetti, P. (2014). The TRPA1 channel in inflammatory and neuropathic pain and migraine. In Reviews of Physiology, Biochemistry and Pharmacology, Vol. 167 (pp. 1-43). Springer International Publishing.
Posted in BEACON Researchers at Work | Tagged , , , , | Comments Off on Big things happen in small rodents: grasshopper mice as a model for the evolution of pain resistance

The Poetry of Scientific Experiments

This post is written by UW grad student Sonia Singhal

TL;DR: Like poems, “beautiful” scientific experiments have a cohesive, coherent structure where each part reinforces the whole. In this post, I analyze the structures of the poem “Easter Wings” by George Herbert and the Meselson-Stahl experiment from biology. This work is the opinion of the author, and does not necessarily reflect the views of BEACON or its researchers.

Throughout history, people have said that science removes the beauty from the world. Yet scientists often find great beauty in their work. Scientific experiments have even been called “beautiful” by other scientists. Why is this? What is it about certain experiments that gives the perception of beauty?

I propose that one element may lie in the structure of the experiments. Specifically, we perceive beauty when the individual pieces reinforce each other to form a cohesive, coherent whole. In this way, a “beautiful” scientific experiment is akin to a work of poetry.

In poetry, perhaps more than in any other type of writing, structure (or, as a poet would say, form) is paramount. Poetry is judged not only on what the poet says, but also on how she says it. Aspects of the how include how a reader would recite the poem – for example, the rhyme scheme and the rhythm – as well as the look of the poem on the page. Are the lines short or long? Are they on the left-hand side of the page or the right, or do they straddle the center line? Even white space is meaningful.

A concrete poem, whose shape reflects its subject, takes form to an extreme. I’ll use the concrete poem “Easter Wings,” written by George Herbert in the sixteenth century, to illustrate why structure is so important in poetry.

Lord, who createdst man in wealth and store,
Though foolishly he lost the same,
Decaying more and more,
Till he became
Most poore:
With thee
O let me rise
As larks, harmoniously,
And sing this day thy victories:
Then shall the fall further the flight in me.

My tender age in sorrow did beginne
And still with sicknesses and shame.
Thou didst so punish sinne,
That I became
Most thinne.
With thee
Let me combine,
And feel thy victorie:
For, if I imp my wing on thine,
Affliction shall advance the flight in me.

Herbert’s poem describes the second coming of Christ (a popular subject for writers of the time). However, the beauty of the poem does not come (solely) from its subject. It also comes from the way the poem’s structure reinforces its message and themes.

Since this is a concrete poem, let’s start with the general shape. When you look at the poem from a distance, without reading the words, what does the shape remind you of? (Hint: Tilt your head sideways.)

Herbert’s poem has two stanzas (blocks of text separated by white space), each with a winged, bird-like shape. Herbert achieves this visual effect by altering the length of the lines. He starts the stanza with a line of ten syllables. Each subsequent line has fewer and fewer syllables, until the lines in the middle of the stanza have only two syllables each. Herbert then increases the number of syllables per line until he returns to ten syllables in the final line of the stanza.

When we pair the shape with the text, we find that where the lines contract and expand is not arbitrary. The lines contract when Herbert talks about mankind or himself, made “less” (according to Christian doctrine) by the curse of original sin. In particular, the shortest lines contain words that denote dearth or scarcity: “poore” and “thinne.” One interpretation of the short lines might be that they represent the narrowness or short-mindedness that people can exhibit when they only think about themselves. In contrast, the lines expand when Herbert talks about his faith in God and Christ, which makes him part of something greater than himself.1

Herbert repeats structural elements within his poem. Most obviously, there are two stanzas, each with the appearance of a flying bird. Herbert repeats words and rhymes between the stanzas. Rhymes ending in “-ame,” such as “shame” and “same,” appear in the first half of both stanzas, while rhymes ending in “-ee,” such as “thee” and “me,” appear in the second half.2 The second half of both stanzas also includes “victory,” “flight,” and other words that relate to birds (“larks” in the first verse; “wing” and “imp” in the second. “Imp” is a falconry term – when you imp a bird with broken or clipped wings, you attach new feathers to its wings to allow it to fly again). The references to flight coincide with Herbert’s recovery of hope and faith. Even the last lines of both verses are structurally similar (compare “Then shall the fall further the flight in me” to “Affliction shall advance the flight in me”). The repetition makes the poem feel tightly knitted and could represent the strength of Herbert’s faith.

In “An Essay on Criticism,” eighteenth-century poet Alexander Pope said of poetry that “The sound must seem an echo to the sense.” In other words, structure in poetry must emphasize the meaning. Herbert’s poem exemplifies this philosophy. The bird-like shape of the poem emphasizes the themes of rebirth and renewed faith, while the tight repetition of words and phrases creates a sense of safety and security. The structure and the meaning of the poem resonate with one another, and this resonance makes the poem “beautiful.”

The value of structure extends beyond poetry. The scientific process has its own inherent structure:

  1. You ask a question.
  2. You form a hypothesis, or a reasonable explanation, about the answer.
  3. You run an experiment.
  4. The experiment gives you data (results) that either support or refute your hypothesis.

When each of these pieces (question, hypothesis, experiment, results) reinforces the others with no extraneous content – in other words, when the structure of the whole experiment echoes its core message – the experiment may resonate with us in the same way that a well-constructed poem does.

To explore this idea, I’ll use a classic experiment in biology, the Meselson-Stahl experiment, which was fundamental to our understanding of how DNA works.

DNA is a molecule found in all living things. Each time a cell divides, or a parent has a child, the parent’s DNA gets copied and passed on to the next generation. Because the building blocks of DNA were relatively simple compared to other molecules, its importance was originally underestimated. But in the first half of the twentieth century, the results of biologists’ experiments began to indicate that DNA was in fact necessary for life. It encoded information about whether a living being was human, animal, or plant; what it looked like; and how its body worked.

DNA’s shape was first proposed in 1953 by James Watson and Francis Crick.3 At the time, they knew that the building blocks of DNA included four different bases that contained nitrogen (specifically, adenine, cytosine, thymine, and guanine). Watson and Crick also knew that, in any particular DNA molecule, there were equal amounts of adenine and thymine bases, and equal amounts of cytosine and guanine bases, but different amounts of adenine/thymine versus cytosine/guanine. Based on a photograph of X-rays bouncing off DNA taken by Rosalind Franklin, Watson and Crick suggested that DNA was made of two strands that wound around each other in a double helix (Fig. 1). The bases were arranged in pairs between the two strands like rungs on a ladder: adenine with thymine, cytosine with guanine.

Figure 1. Basic shape of a DNA molecule. The two strands (yellow) form a double helix. Between them, adenine (green) pairs with thymine (purple), and cytosine (red) pairs with guanine (blue).4

This arrangement immediately suggested a way of copying DNA without losing the information it encoded. Because the bases are paired, it is possible to determine the sequence of one DNA strand from the other’s: A thymine on one strand always corresponds to an adenine on the other, while a guanine on one strand always corresponds to a cytosine on the other. If you separate the strands, you can make an exact copy of each one to get a new DNA molecule.
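
This copying rule is simple enough to spell out in a few lines of code. The snippet below is only an illustration of the pairing logic just described, not anything from the original experiment; the example sequence is made up.

```python
# Base pairing: each base determines its partner on the opposite strand.
PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complementary_strand(strand):
    """Return the sequence of the partner strand implied by base pairing."""
    return "".join(PAIR[base] for base in strand)

original = "ATCGGCTA"                     # a made-up sequence
copy = complementary_strand(original)
print(copy)                               # TAGCCGAT
print(complementary_strand(copy))         # ATCGGCTA again
```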

However, base pairing also meant that different researchers came up with different ideas on exactly how the process of replication, of copying and creating a new DNA molecule, might work. They suggested three different hypotheses (Fig. 2):

  1. Semiconservative replication. The original DNA strands separated and were copied, and the new DNA molecules were made of one strand of original DNA and one strand of new DNA.
  2. Conservative replication. The original DNA strands separated and were copied, but afterwards the original DNA strands re-paired, and the newly made DNA strands paired with each other. One molecule was made up only of the original DNA, and the other was made up only of new DNA.
  3. Dispersive replication. The process of copying DNA involved making short segments from both strands – in other words, copying some of one strand, then some of the other, then more of the first. Molecules made this way would contain some original DNA and some new DNA in both strands.

Figure 2. Illustration of three hypotheses for how DNA might replicate.5

In 1957, Matthew Meselson, a graduate student, and Frank Stahl, a post-doctoral researcher, designed a simple experiment with bacteria to determine which of these hypotheses was correct.6 They took advantage of the fact that nitrogen, which appears in the DNA bases, has a heavy form and a light form. Normally, the lighter form of nitrogen appears in living things, so using the heavy form of nitrogen in DNA would let Meselson and Stahl track specific strands. In their experiment, they put the heavy form of nitrogen into the DNA of bacterial parents, but they only let the bacterial children and grandchildren use the light form of nitrogen to make DNA copies. The weight of the resulting DNA would reveal how it had been copied.

  1. In semiconservative replication, the weight of the DNA would change with each generation. In the first (child) generation of bacteria, new DNA molecules in the child bacteria would be made of one strand of old (heavy) DNA and one strand of new (light) DNA. DNA in both parents and children would have an intermediate weight.

    In the second (grandchild) generation of bacteria, new DNA molecules could be copied from both the old (heavy) and new (light) strands. There would be a mixture of intermediate-weight DNA (copied from heavy strands) and light-weight DNA (copied from light strands).

  2. In conservative replication, the parent would still have only heavy DNA, and the children and grandchildren would have only light DNA.
  3. In dispersive replication, all new DNA molecules would have some old DNA and some new DNA. All molecules would be of intermediate weight in parents, children, and grandchildren.
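
To make these predictions concrete, here is a small sketch (my own addition, not part of the original experiment) that tabulates the expected fractions of heavy, hybrid (intermediate-weight), and light DNA molecules in the culture over the first two generations under each hypothesis:

```python
def semiconservative(gens):
    # each molecule keeps one parental strand until that strand is diluted out
    heavy, hybrid, light = 1.0, 0.0, 0.0
    history = [(heavy, hybrid, light)]
    for _ in range(gens):
        heavy, hybrid, light = 0.0, heavy + hybrid / 2, light + hybrid / 2
        history.append((heavy, hybrid, light))
    return history

def conservative(gens):
    # the parental double helix stays intact; copies are made entirely of new DNA
    heavy, light = 1.0, 0.0
    history = [(heavy, 0.0, light)]
    for _ in range(gens):
        heavy, light = heavy / 2, light + heavy / 2
        history.append((heavy, 0.0, light))
    return history

def dispersive(gens):
    # every molecule is a blend; the parental material is roughly halved each
    # generation, so all molecules sit at one (ever lighter) density
    return [0.5 ** g for g in range(gens + 1)]

for name, table in [("semiconservative", semiconservative(2)),
                    ("conservative", conservative(2))]:
    print(name)
    for g, (h, m, l) in enumerate(table):
        print(f"  gen {g}: heavy={h:.2f} hybrid={m:.2f} light={l:.2f}")

print("dispersive")
for g, frac in enumerate(dispersive(2)):
    print(f"  gen {g}: all molecules at heavy fraction {frac:.2f}")
```

Only the semiconservative table matches what Meselson and Stahl actually observed: all hybrid DNA after one generation, and a half-and-half mixture of hybrid and light DNA after two.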

When Meselson and Stahl weighed the DNA before and after the bacterial parents reproduced, the results were indisputable. Before reproducing, the parents only had heavy DNA. After the first generation, parents and children had DNA of equal, intermediate weight. After the second generation, Meselson and Stahl saw both intermediate-weight and light-weight DNA. DNA was being copied in a semiconservative manner.

Even before the results of the experiment had been formally published, scientists called Meselson’s and Stahl’s experiment “beautiful.” Meselson said the results were “clean as a whistle.” Maurice Wilkins, a physicist and molecular biologist, described their paper on the experiment as “elegant and definitive.” Another molecular biologist, Gunther Stent, said that “it really tells the whole story.” Stahl noted, many years later, that he had “been trying to do something half as pretty ever since.”7

In the same way that the form of “Easter Wings” reinforces its meaning, the tight, self-contained logic and cohesiveness of the Meselson-Stahl experiment strengthens its core message. Structure is a little more difficult to illustrate in an experiment than a concrete poem, so I’ll start with a reductionist approach.

Suppose we strip the experiment down to its minimal necessary information: the question (How does DNA replicate?) and the answer (Semiconservatively). I would argue that these are bare facts; on their own, they would probably not be considered beautiful. Let’s add in the hypothesis, which gives details on the copying mechanism (the DNA helix unwinds, and a new DNA strand is copied from each original strand. New DNA is thus made of one strand of old DNA and one strand of new DNA). Now the answer makes a little more sense. The explanation is logical, and it matches the pairing of bases in DNA. Next, we add back the experiment – how you go about testing this hypothesis (start only with heavy DNA and make new, light DNA from it. Then track how much of each type of DNA, heavy and light, there is over time). Under different hypotheses of DNA replication (semiconservative, conservative, or dispersive), we expect a different result, so this single experiment will immediately let us rule out two of the three. Finally, we add back the results (from heavy DNA, we go to DNA of intermediate weight, then a mixture of intermediate-weight and light-weight DNA). One hypothesis (semiconservative replication) is supported; the others are rejected.

By breaking the experiment down in this way, we can begin to understand how the individual pieces of the Meselson-Stahl experiment – question, hypotheses, experiment, and results – work together to form a coherent whole. First, every piece centers on a single issue (how DNA is copied), giving internal cohesion. At the same time, each new piece gives us additional information, providing the impression of direction. Finally, there is closure and completeness: The possibilities suggested by the three hypotheses are neatly tied up by the results. By the end of the experiment, we have come full circle with the answer to our question. The pieces of the Meselson-Stahl experiment strengthen and resonate with each other, and we perceive this resonance as beautiful.

Structure is important to a work in any genre – science or art – because it allows us to organize our understanding of the work’s messages and themes. When the structure focuses around and reflects the themes in some way, it provides additional power to the work. In this way, a close examination of structure can give us another lens through which to evaluate “beauty” across disciplines.

Notes

1 The same argument works on a theological level as well. The title, “Easter Wings,” tells us that this is a poem in celebration of Easter, or the resurrection of Christ after his death. In a reading from this perspective, short lines represent death, while long lines represent life or resurrection. The poet’s spiritual death and rebirth parallel the death and rebirth of Christ.

2 Six of the ten “-ame” and “-ee” rhymes even involve repetition of entire words: “became,” “me,” and “thee.”

3 Watson, J. and Crick, F. 1953. Molecular structure of nucleic acids. Nature 171:737-738.

4 Credit: DNA_simple.svg by user Forluvoft, Wikimedia Commons.

5 Credit: DNAreplicationModes.png by Mike Jones, Wikimedia Commons.

6 Meselson, M. and Stahl, F.J. 1958. The replication of DNA in Escherichia coli. PNAS 44:671-682.

7 Quotations from Holmes, F.J. 1996. Beautiful Experiments in the Life Sciences. In Tauber, A.I. (ed.) The Elusive Synthesis: Aesthetics and Science. Kluwer Academic Publishers, Dordrecht, pp. 83-101.

Posted in BEACON Researchers at Work | Tagged , , | Comments Off on The Poetry of Scientific Experiments

Historic $12.7 million gift to BEACON Center, MSU College of Engineering

John R. Koza, who is considered the “father of genetic programming,” has donated $12.7 million to the Michigan State University College of Engineering — the college’s largest individual gift.

The Michigan State University College of Engineering has received its largest individual gift in the history of the college.

A $10.7 million bequest from a California entrepreneur joins a previous cash gift of $2 million, bringing his total giving to $12.7 million to support the college and the BEACON Center for the Study of Evolution in Action, one of the National Science Foundation’s Science and Technology Centers.

The commitment is from computer scientist John R. Koza, who is considered the “father of genetic programming.”

The $10.7 million bequest will fund two endowed faculty positions to attract eminent scholars for the development of computational tools inspired by nature. New endowments also will advance genetic programming and evolutionary computation through endowed prizes, fellowships and programs to attract top graduate students and an increasingly strong pool of faculty members, said engineering Dean Leo Kempel.

“The creation of two new faculty endowments joining a third endowed chair, as well as endowed prizes and graduate student support, is unprecedented in the College of Engineering,” Kempel said. “We are very grateful to Dr. Koza for the advances our faculty will achieve and the students we will serve as a result of this extraordinary gift.

“With this gift,” Kempel continued, “and the previous investment by the National Science Foundation in the BEACON Center, Michigan State University will be the leading institution for transformational research and education in this important field of scholarship.”

Koza said he is delighted to make the investment in the BEACON Center and the College of Engineering and believes they are the best place to carry forward his life’s work.

“The mix of private support, NSF support, and backing from MSU, under the guidance of my good friend and colleague, Erik Goodman, means the BEACON Center and its groundbreaking work will continue for many years to come,” Koza said. “My personal connections to BEACON, MSU and the partner institutions have been very gratifying and I look forward to what we can do together.”

MSU President Lou Anna K. Simon said the gift will create a hub of expertise and excellence in a demanding and promising field.

“John Koza’s continued generosity will empower us to build on his pioneering work. We are thankful for his vision and investment in the research and learning being done at Michigan State, which will resonate far into the future.”

Koza’s $2 million cash gift was received in 2014 and created the John R. Koza Endowed Chair in Genetic Programming. In August 2016, MSU welcomed renowned specialist in genetic programming and evolutionary computation Wolfgang Banzhaf as the Koza endowed chair.

Koza is a computer scientist and pioneer in the use of genetic programming, or GP. For much of his career, he was a consulting professor at Stanford University, teaching classes about evolutionary computation and genetic programming while conducting his research in that field.

“His ideas have helped to push back the horizon of what we believe computers can do now and in the future,” said Erik Goodman, director of the BEACON Center for the Study of Evolution in Action. The BEACON Center unites those who study natural evolutionary processes with computer scientists and engineers to solve real-world problems.

Goodman, who has been friends with Koza since they were graduate students in the 1960s and 1970s, called Koza a brilliant computer scientist.

“John Koza is frequently called the father of GP. His publication of four gigantic books introducing genetic programming to the world, beginning in 1992, helped to earn him this accolade,” Goodman explained. “In his books, he introduces the concepts of automated programming of computers by evolutionary processes.”

Goodman said Koza’s early work also included organizing a series of international conferences on Genetic Programming, which he and Goodman helped later to merge into a broader conference series on evolutionary computation, the Genetic and Evolutionary Computation Conferences.

“All of us owe some of our inspiration to the successes achieved by Koza,” Goodman added. “In the end, there may be few of us whose lives are not touched in some way or another by John Koza’s work.”

Endowed funds allow the university to provide continual support to specific programs and projects. The gift’s principal is invested and a portion of the annual earnings is used for annual program support.

The gifts support Empower Extraordinary, the $1.5 billion campaign for MSU that launched publicly in October 2014. To date, the College of Engineering has raised more than $76 million of its $80 million campaign goal.

Posted in BEACON in the News | Tagged , | Comments Off on Historic $12.7 million gift to BEACON Center, MSU College of Engineering