BEACON Evolved Art Competition Results

For the past three months, participants in BEACON’s evolved art competition have been using evolution to create art pieces that resemble the BEACON lighthouse. “How is that possible?” you ask?

Each entry started as a random image that looked more like abstract art than a lighthouse.

A random initial image from Picbreeder (www.picbreeder.org)

On Picbreeder, evolutionary artists can select one image from a group of images to “reproduce” into the next generation. Following an evolutionary process, Picbreeder then creates 15 copies of that image (“offspring”), each with a slight change to the image (a “mutation”).  Of course, since the artist, rather than nature, is doing the selecting, this is a form of directed evolution, not natural evolution. 

The 15 copies of the original random image with slight changes (“mutations”)

As you can see above, some mutations make the image look more appealing, whereas other mutations make the image look worse than it originally was. Some mutations don’t make the image look very different at all, which is of course important if the evolutionary artist doesn’t want to lose the image she just made copies of. 
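For the curious, here is a minimal sketch of what one generation of this select-and-mutate loop might look like in code. It is only an illustration: Picbreeder actually encodes images as networks (CPPNs) rather than as the simple list of pixel parameters assumed here, and the population size of 15, the mutation rate, and the automated "artist" are all stand-ins.

```python
import random

POPULATION_SIZE = 15   # the number of "offspring" shown to the artist each generation
MUTATION_RATE = 0.1    # chance that any one parameter is slightly perturbed (illustrative)

def mutate(image):
    """Return a copy of the image with a few small random changes ("mutations")."""
    child = list(image)
    for i in range(len(child)):
        if random.random() < MUTATION_RATE:
            child[i] += random.gauss(0.0, 0.05)  # a slight change to one parameter
    return child

def next_generation(selected_image):
    """The artist picks one image; the system breeds 15 mutated copies of it."""
    return [mutate(selected_image) for _ in range(POPULATION_SIZE)]

# A few rounds of directed evolution: start from a random "image" and repeatedly
# let a (simulated) artist pick the offspring they like best.
image = [random.random() for _ in range(64)]
for generation in range(10):
    offspring = next_generation(image)
    image = max(offspring, key=sum)  # stand-in for the artist's aesthetic choice
```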

After successive generations of “breeding” the images, the evolutionary artists started to evolve images that looked more and more like the BEACON lighthouse—or at least something evocative of it—until they reached the images shown below. These entries highlight the creative power of evolution: beautiful art pieces can be created by accumulating the right mutations over time!

While none of the entries look exactly like the BEACON lighthouse, it is important to note that the goal of the competition was to evolve an alternative lighthouse image. If the goal had instead been to evolve an exact copy of the existing logo, the competition would likely have failed: even if we clearly define a specific goal for directed evolution, and even if we have full control over selection (such as when humans breed animals), evolution may not produce the desired result. The reason is that although certain characteristics can be selected when they appear, there is no guarantee that they will appear. This is unsurprising when you think about it: we would not expect a human to be able to breed a Tyrannosaurus rex from a beagle, even given millions of years to do it.

We received 50 entries, which were judged by a committee of 6 BEACON scientists and educators: Connie James, Masoud Mirmomeni, Randy Olson, Rob Pennock, Jory Schossau, and Allison Walker. The committee selected the top three winners and a few entries that were too good not to highlight. Enjoy looking through the top entries!

1st Place

This entry not only had the lighthouse, but a beautiful beam of light shining out from it.

2nd Place

Surprisingly, this entry has a lighthouse that looks like it is on a peninsula. The lighthouse even has a reflection in the water!

3rd Place

This entry has a foggy look to it, with the lighthouse shining in the background.

Honorable Mentions

This lighthouse has two beams shining out from it.

This entry has a series of gradually shrinking lighthouses. It also looks a little like a Disney castle!

This entry is clearly related to the 1st Place winner, but with a shorter beam of light.

The winners have all been contacted and will soon receive their prizes. Thank you to all of our contest entrants for your participation and creativity!


BEACON Researchers at work: Changing environments / changing organisms

This week’s BEACON Researchers at Work blog post is by University of Washington graduate student Peter Conlin.

Natural selection produces an organism whose phenotype is well matched to its environment. Under a constant environment there should be a single optimum, but what happens in a varying environment when different phenotypes are favored at different times? Environmental change can be seen as a challenge because it may disrupt the match between phenotype and environment, leading to a decrease in fitness. One solution to this mismatch problem is for individual organisms to condition their phenotype on the state of their environment.

The ability of a single genotype to alter its phenotype in response to changes in the environment is called phenotypic plasticity. Phenotypic plasticity can play many important roles in evolution – it can increase fitness, generate novelty, and plays a role in structuring ecological communities. This concept was first applied to changes in morphology but has more recently been extended to include a diversity of environmentally induced changes.


The development of a defensive head shield and tail spines in Daphnia. The first described and arguably cutest example of phenotypic plasticity (from Woltereck, 1909).

A classic example of adaptive phenotypic plasticity, and the first described, comes from the water flea, Daphnia. When grown in the presence of a predator, Daphnia will grow defensive head shields and tail spines that are thought to increase their chance of survival by deterring predators.

Phenotypic plasticity is found across all levels of biological complexity and theory predicts that plasticity can be adaptive when the following criteria hold:

  1. Organisms experience different environments (spatially or temporally).
  2. Different environments favor different phenotypes.
  3. The environmental cue provides reliable information about selective conditions.

The third criterion is especially important because in some cases, the phenotypic change must precede the selective conditions. In the case of Daphnia, the defensive head shield must grow before encountering a predator (if it is to be effective) and growing a head shield in the absence of a predator is thought to be costly. Daphnia are known to use chemicals released by their predators as a cue to grow a head shield. Interestingly, researchers have found that an increase in water temperature will elicit the same defensive response in some species!

A vast literature on phenotypic plasticity exists from studies of natural and laboratory populations, much of it from just the past 20-30 years. Experiments have demonstrated that plasticity is an evolvable trait and that it can be directly selected, and they have given us a better understanding of the complex genetics of phenotypic plasticity. Even so, a great number of theoretical predictions about the evolution of plasticity remain untested. My research in Ben Kerr’s lab uses an experimental evolution approach to understand the role of environmental cue reliability in the evolution of phenotypic plasticity and what genetic changes occur as plasticity evolves.

The phenotypic plasticity team. From left to right: Joseph Marcus, Samuel Reed, and Peter Conlin.

We are working with a strain of the budding yeast, Saccharomyces cerevisiae, from Will Ratcliff and Mike Travisano at the University of Minnesota that forms multicellular clusters, a phenotype caused by incomplete cell separation. The phenotype we are focusing on is the size of the yeast cluster. We chose this because cluster size varies across isolates and clusters can be easily size-sorted by centrifugation due to differential rates of sedimentation.

Multicellular yeast clusters. This phenotype is found in many natural isolates but also readily evolves when yeast are grown under selection for rapid settling. Multicellular yeast clusters have also recently been evolved by Andrew Murray’s lab under selection for improved utilization of a public good.

In the experiment yeast clusters are grown in a pair of alternating cue environments referred to as E1 and E2. We can favor the evolution of plasticity by pairing the different environments with opposing selective conditions (propagating either the top or bottom of the culture after centrifugation). Cue reliability is high when E1 and E2 are uniquely paired to top or bottom selection, respectively. We can test theoretical predictions about the importance of cue reliability by deviating from the 1:1 correlation between environment and selection. Our hypothesis is that phenotypic plasticity is more likely to evolve when cue reliability is high. We hope that this work will shed light on the conditions that favor the evolution of phenotypic plasticity and the underlying mechanisms for achieving plasticity.
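To make the idea of cue reliability concrete, here is a minimal simulation sketch. It is not the lab's protocol or analysis code: the mapping of small clusters to top selection (slow settling) and large clusters to bottom selection (fast settling), and the idea of a perfectly plastic genotype, are simplifying assumptions for illustration only.

```python
import random

def mean_fitness(cue_reliability, generations=10000):
    """Compare a plastic genotype (conditions its phenotype on the cue) with a
    fixed genotype (always forms large clusters) as cue reliability varies."""
    plastic_wins, fixed_wins = 0, 0
    for _ in range(generations):
        selection = random.choice(["top", "bottom"])   # which fraction of the tube is propagated
        cue_correct = random.random() < cue_reliability
        # Assumed mapping: small clusters settle slowly (favored by top selection),
        # large clusters settle quickly (favored by bottom selection).
        favored = "small" if selection == "top" else "large"
        plastic_phenotype = favored if cue_correct else ("large" if favored == "small" else "small")
        plastic_wins += plastic_phenotype == favored
        fixed_wins += "large" == favored
    return plastic_wins / generations, fixed_wins / generations

for p in (0.5, 0.75, 1.0):
    print(f"cue reliability {p}: plastic vs. fixed fitness = {mean_fitness(p)}")
```

With a 50/50 cue the plastic strategy does no better than the fixed one; as the cue becomes reliable, plasticity pulls ahead, which is the intuition behind the hypothesis described above.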

For more information about Peter’s work, you can contact him at pconlin2 at u dot washington dot edu.


BEACON Researchers at Work: Evolving division of labor

This week’s BEACON Researchers at Work post is by MSU graduate student Anya Johnson.

Have you ever looked around you and thought about the amazing feats that organisms accomplish together? The most obvious examples are of course everything that humans have made, but if you look closer there are a lot of species that work together to master their habitat. Eusocial insects (like ants and termites) make incredible structures to live in. I’m interested in how organisms take that large task that needs to be accomplished (the entire termite mound) and break it into small pieces (building just this part of the mound). While humans can easily discuss who is going to do what, many groups of animals have evolved their own strategies to break apart a large task. This process of breaking a problem down into smaller (and easier) pieces can be called problem decomposition.

Photo by Steve Corey: http://www.flickr.com/photos/stevecorey/7281531296/

I am currently using the digital evolution platform Avida to study what processes contribute to the evolution of problem decomposition and how computer scientists could harness problem decomposition to solve very difficult problems in computer science. I’m focusing on how groups of organisms in Avida can coordinate to break complex tasks down into smaller pieces that can be solved by individual organisms. In Avida, digital organisms can perform tasks, such as math problems (6A + 2B) or logic operations (NOT, OR), to get rewards. I’ve set it up so that a group of organisms has to work together to solve a certain number of these tasks before the group as a whole gets to reproduce. This group reproduction dynamic is a lot like a simplified version of an ant colony. Most of the ant colony focuses on working instead of reproducing. Since the queen is the mother of (most) of the workers, the whole colony is closely related. The colony passes on its genes when a new queen is born and goes off to start a new colony. In my experiments, when the group has performed enough tasks, a new group is started from the genetic material of the successful group. Because there are only so many places a group can live, eventually new groups start killing off the old groups. This creates competition between groups of organisms to reproduce quickly in order to keep their lineage alive.

Previously, Heather Goldsby and her colleagues discovered that under environmental conditions that favor division of labor, organisms within groups would communicate solutions to simple problems that could be used as the building blocks for more complex problems. Specifically, for this work, they used a suite of nine logic tasks of increasing complexity. It was faster for a couple of organisms to solve the simplest tasks and send those answers to another organism. When that other organism received the messages, it was able to combine the simple answers into a more complex solution. In this way, the group broke down the most complex task into the simpler tasks. In fact, Goldsby et al. found that when the organism that could solve the most complex task was taken out of the group, it could no longer do the task because it relied on the messages from the other organisms.

This was a very exciting thing to discover because evolutionary computation approaches to problem decomposition have been challenging to engineer, both within Avida and in other systems. We wanted to know exactly what aspects of Goldsby et al.’s project were necessary for problem decomposition to evolve. To figure this out, I started changing different parts of the environment that Goldsby et al. used.

The organisms were all being given the same three inputs to their logic tasks, and we hypothesized that this was important for the evolution of problem decomposition. This is because if the organisms were given different inputs, they wouldn’t be solving the same problems. It’d be pretty hard to combine answers when you aren’t solving the same problem, right? The organisms were also being rewarded when they sent a message that solved a task. Organisms could basically kill two birds with one stone: get more tasks counted as solved for the group’s reproduction and pass on an answer to another organism. We hypothesized that if they stopped getting rewarded for solutions in messages, it was less likely that the organisms would evolve to use messaging and thus problem decomposition would have a harder time evolving.

We were also interested in seeing if the evolution of problem decomposition in this setup would be influenced by the type of tasks the organisms were doing. We started with the basic “logic nine” tasks, which are nine logic operations that build on each other. What if the organisms had to solve math problems instead? We would hope that this method of problem decomposition is robust to the type of problems the organisms are solving, since we want to use it for all sorts of problems. To test how robust this system is, we’ve started testing the Avidians on a set of nine math problems that build on each other (like A+B and 2A+3B). We’re also testing them on the first ten Fibonacci numbers. Those tasks just require the organisms to output one of the first ten Fibonacci numbers (starting at 0). The Fibonacci numbers also build on themselves because it would be easy for the organisms to send two of the beginning numbers to another organism that could add them together to get the next number.
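To make the "tasks that build on each other" idea concrete, here is a small illustrative sketch (plain Python, not Avida code). The particular combined task, 3A + 4B, is an invented example; the point is only that answers to simpler tasks, passed along as messages, are enough to produce the harder answers.

```python
# Simple tasks that individual "organisms" might solve on their own.
def task_sum(a, b):        # A + B
    return a + b

def task_weighted(a, b):   # 2A + 3B
    return 2 * a + 3 * b

# A harder task built purely from messages carrying the simpler answers:
# (A + B) + (2A + 3B) = 3A + 4B.
def task_combined(msg_sum, msg_weighted):
    return msg_sum + msg_weighted

# Fibonacci tasks build on themselves the same way: two earlier answers,
# received as messages, yield the next number in the sequence.
def next_fibonacci(msg_prev, msg_prev2):
    return msg_prev + msg_prev2

a, b = 6, 2
messages = [task_sum(a, b), task_weighted(a, b)]   # "sent" by two organisms
print(task_combined(*messages))                    # "received" and combined: 26 = 3*6 + 4*2
print(next_fibonacci(5, 3))                        # 8, the next Fibonacci number
```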

All of these experiments with these groups of organisms are heading toward an ultimate goal of making a problem decomposition system that can automatically decompose any problem given to it. Parallel programming is when a computer program is broken into pieces that can run on different machines (or cores within one computer) in parallel. This makes the program run much faster than if each of those pieces had to be run one after another. Parallel programming is sometimes difficult for programmers because it can be hard to figure out how to break down the program. However, parallel programming is basically a problem decomposition task, since the serial (non-parallelized) program just needs to be broken down into subparts and then recombined to solve the most complex task (i.e., the whole program). If we can put the program into a task for the organisms, hopefully we’ll have a system that evolves a group of organisms that can break that program into its simpler parts. Evolution has a history of finding solutions to problems that humans didn’t think of, and who knows what these organisms will discover about how to parallelize programs.
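As a generic illustration of that parallelization-as-decomposition view (this is not part of the Avida experiments, and the summing task is arbitrary), here a serial computation is broken into independent chunks, solved separately, and recombined:

```python
from concurrent.futures import ProcessPoolExecutor

def subproblem(chunk):
    """One decomposed piece of the original problem: sum the squares of a chunk."""
    return sum(x * x for x in chunk)

def solve_serial(data):
    return sum(x * x for x in data)

def solve_parallel(data, n_chunks=4):
    chunks = [data[i::n_chunks] for i in range(n_chunks)]  # break the problem apart
    with ProcessPoolExecutor() as pool:
        partial_results = pool.map(subproblem, chunks)     # solve the pieces independently
    return sum(partial_results)                            # recombine into the full answer

if __name__ == "__main__":
    data = list(range(1_000_000))
    assert solve_serial(data) == solve_parallel(data)
```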

For more information about Anya’s work, you can contact her at anyaejo at gmail dot com.


BEACON Researchers at Work: To What Place Workflowmics?

This week’s BEACON Researchers at Work blog post is by NC A&T faculty member Scott Harrison.

A practical challenge in genomic studies has been for students to conceive of different outcomes concerning variation, and to test these outcomes against data sets that can contain hundreds of genomes and millions of encoded features such as open reading frames (ORFs). Workflows are contexts of software usage that rely upon frequent interactions with a human operator. As students learn to develop and utilize workflows for genomic analyses, they are building essential skills as 21st century knowledge professionals who will be in demand for the decision-making and expert knowledge required by these workflows. This is an area of effort I suggest be called “workflowmics.”

Many of the biotechnologies and software tools for genomic data capture and analysis that we utilize today will be radically different in a few years’ time. Perhaps some essential databases and software tools will be maintained and updated, but many will not. Even for those computational resources that remain in demand, changing versions of these resources, underlying programming languages, library dependencies, and host operating systems can dramatically alter the stability and features of an expected software inventory. The meaning of data collected across different data formats, data annotations, protocols and personnel will further challenge how evolutionary biologists aspire to integrate data for the purpose of comparisons across life’s diversity.

Some research in computer science on syntactic and semantic resolution may be expected to guide the development of transformative or “disruptive” technologies, and that is an area worth tracking for anyone with an interest in integrative biological disciplines such as evolution. There is, however, a wide range of existing computational solutions that are proven and reliable. Even the seemingly trivial usage of a spreadsheet can be very effective at syntactic and semantic resolution, and a spreadsheet application is a classic instance of a software tool relied upon for many workflows. A well-constructed workflow prioritizes a robust mode of implementation and validation by the human operator, who will have the flexibility to make various on-the-spot decisions, including those syntactic and semantic resolutions that may not have been anticipated in advance.
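As a deliberately tiny example of the kind of syntactic and semantic resolution a workflow step might perform, here is a sketch that reconciles ORF annotations from two hypothetical files whose identifiers follow different conventions. The file names, column names, and identifier formats are invented for illustration.

```python
import csv

def normalize_orf_id(raw_id):
    """Syntactic resolution: 'orf_0001', 'ORF-0001', and 'ORF0001' all map to 'ORF0001'."""
    return raw_id.upper().replace("_", "").replace("-", "")

def load_annotations(path, id_column):
    with open(path, newline="") as handle:
        return {normalize_orf_id(row[id_column]): row for row in csv.DictReader(handle)}

# Two hypothetical annotation tables produced by different tools or personnel.
genome_a = load_annotations("annotations_toolA.csv", id_column="orf")
genome_b = load_annotations("annotations_toolB.csv", id_column="ORF_ID")

# The semantic step still needs the human operator (or a documented rule): deciding
# that toolA's "product" column and toolB's "predicted_function" column mean the same thing.
shared = {orf_id: (genome_a[orf_id]["product"], genome_b[orf_id]["predicted_function"])
          for orf_id in genome_a.keys() & genome_b.keys()}
```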

Within the context of genomic analysis workflows and their human operators, I am interested to address the following two questions. How will scientific advancement, biotechnological power, and industrial competitiveness follow an efficient path? How do we involve “students” (which, in the fast-paced scenario of data-driven biology, means most everyone) in scientific findings, technical solutions, and industrial productivity? My proposed answer is that an efficient path for genomic studies will come from building knowledge of evolution in a student-centered, diverse community. Similar to how many mechanisms of physical systems can be modeled with several laws of motion, the extraordinary scale of genomic data used to chart life’s diversity can be modeled with several evolutionary processes. There is far-reaching potential for the known evolutionary processes of variation, selection and inheritance to be used across contexts of discovery, experimentation, and engineering. In order to reach this potential, there must be an effective approach for data management. Workflows have been necessary to harness the analytical power of huge volumes of empirical data that can be produced with modern biotechnologies, but a limiting factor has been the large amounts of focused time needed for thorough data analysis. A possible remedy would be “crowd-sourcing.” This requires the recruitment and development of a broad wave of scholars who can be the “mortar between the bricks” – a concept and phrase put forward by Judi Brown Clarke at MSU during a BEACON seminar (October 28, 2011).

An inspirational haiku by the blogger.

Enriched interactions with data sets and software help humans to evaluate small-scale and large-scale aspects of biological systems. In my work as a computational biologist, I am interested to know how studying the scalability of biological systems relates to the navigation of various dimensions that are cognitive, institutional, and social: from molecules to communities, from the laboratory to the clinic, and the usage of knowledge across disciplines respectively. These dimensions run parallel to the student-centered Swail geometric model of student persistence and achievement which addresses the retaining of minority students in higher education. The Swail geometric model (the blue “Student Experience” triangle in the figure) is student-centered in that it focuses directly on institutional practices and other factors directly impacting the student experience. By comparison, other models of student retention in the literature have not been based upon how the overall classroom and college experience connects with social factors impacting the student as a central entity.


For the red “Scalability of Biological Systems” triangle, cognition may be thought of as arising from a persevering attention to details, along with conceptualizations that are often pluralistic for biological systems. Moving discoveries in the laboratory to the clinic (or from basic science to application) requires a series of institutional commitments, hopefully leading to positive feedback loops involving credit and external funding. In terms of social factors that will breed success, technical ability can accelerate with collaborations that bring knowledge from other disciplines. An effective approach for collaboration was well-expressed in Luis Zaman’s recent essay, http://beacon-center.org/blog/2013/04/11/consultation-is-not-collaboration/. All of these efforts are data-intensive and may be computationally aided. My own laboratory has a mixture of free and commercial genomic analysis software tools, a clinical data management system, and a cadre of students who are conducting data mining efforts for biomedical studies pursued in collaboration with other laboratories.

There is a societal interest in having the outcomes of scientific research connect to industry. For the shift from the red “Scalability of Biological Systems” triangle to the blue “Student Experience” triangle, a $1,000,000 question is about whether knowledge of evolution leads to economic success that is recognized by social factors. Some of what I do to address social factors in my teaching of evolution is to show how it: 1) helps trigger in-depth understanding of complex material previously encountered which will enable students to become “ace” professionals; and 2) guides students in formulating their own original thinking to arrive at new insights and discoveries. For the classroom and laboratory training experiences I have been developing (see Table), the logistical challenges that are being surmounted by students appear to resonate with the standards of competitiveness advocated by leading organizations such as the National Academy of Sciences. As students collect and analyze empirical data to better illuminate life’s complexity, there is abundant opportunity for students to pick up on professional skills such as experimental design, professional communication, protocol implementation, project management, and interdisciplinary teamwork. I have described some of these classroom and training approaches below.

Training and Student Experience / Research and Economic Thrust

Training and Student Experience: Perl programming workshop on codon usage analysis. Students experience the power of a scripting language for analyzing the textual data of genomics. Analysis of codon usage is a well-studied topic with many hypotheses in the literature. Students assign each other hypotheses to test against recent genomic sequences, and review each other’s work. (A minimal codon-counting sketch follows this table.)
Research and Economic Thrust: Use digital experiments of evolutionary mechanisms in AVIDA to examine the costs of altering a simulated codon set. This work is done in collaboration with molecular biologists who are experimenting with artificial amino acids being introduced into a modified genetic code. This will expand the repertoire of biochemicals that can be synthesized in vivo for industry.

Training and Student Experience: Phylogenetic reconstruction of known evolutionary histories. Students experience the impressive, although imperfect, accuracy of different mathematical models for sequence comparison and tree-building. This initiative is being developed further by hypothesis-driven use of AVIDA, and by purification and sequencing of genomic data from the environment.
Research and Economic Thrust: Pursuit of co-evolutionary studies driven by running next-generation sequencing data from environmental samples against multiple reference genomes. This effort has broad applicability to topics in agriculture and complex disease. For a comparative context, the roles of nutrient factors and phenotypes of pathogen-host adaptations are being evaluated.
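The workshop itself is taught in Perl, but for a quick, language-neutral flavor of what a codon usage analysis involves, here is a minimal counting sketch in Python; the sequence is a made-up stand-in for a real ORF.

```python
from collections import Counter

def codon_usage(orf_sequence):
    """Count how often each codon (non-overlapping triplet) appears in an ORF."""
    seq = orf_sequence.upper()
    codons = [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]
    return Counter(codons)

orf = "ATGGCTGCAGCTTTAGCGGCTTAA"   # a made-up open reading frame, for illustration only
counts = codon_usage(orf)
total = sum(counts.values())
for codon, count in counts.most_common():
    print(f"{codon}: {count} ({count / total:.0%})")
```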

 

Students at North Carolina A&T State University examining phage sequencing data for a phylogenetic reconstruction of a known evolutionary history. This was part of a workshop in the biology department’s undergraduate introduction to research course.

A challenge for all scientific disciplines to consider is whether we are adequately identifying and recruiting students from our nationwide talent pool. The retention and success of minority students in higher education is a key effort of the Department of Biology at my home institution, North Carolina A&T State University, one of the nation’s largest historically black universities. Over the past decade, there have been several advances credited to many in my biology department. These include increased numbers of students in undergraduate research, increased numbers of students presenting at national conferences, and increased numbers of students moving beyond their baccalaureate training at A&T to earn PhDs. I have been contributing to this effort by helping students directly engage concepts about life’s complexity with real-world data. This can be done through classroom experiences that transition students from classical training approaches in evolutionary science to the usage and generation of powerful software workflows. Successful implementation of complex workflows is essential for STEM careers that must confront a growing range of challenges in interpreting biological data such as complex panels of biomarkers, complex microbial communities, and complex interactions between parasitic and host organisms. Workflow-based training that is both conceptual and experiential will enable academic institutions to deliver upon increased expectations for productivity and competencies from a professionally trained workforce.

For more information, you can contact Scott at scott dot h dot harrison at gmail dot com.


BEACON Researchers at Work: The Structure of Coevolution

This BEACON Researchers at Work blog post is by MSU graduate student Luis Zaman. 

In my first BEACON blog post, I wrote about how we study the diversity-producing effects of host-parasite coevolution in Avida. I used a traffic jam metaphor to explain how finding the least-used detour would get you home quickest. This example of negative frequency-dependence is particularly relevant for those of us experiencing Michigan’s Construction Season. In host-parasite communities, that same benefit of being rare can support a diverse set of organisms. I showed you a video from Miguel Fortuna of this diversity where you could actually watch as coevolution produced new interactions and new host or parasite “species.” In the time since, part of my research has been on understanding the structure of those interaction networks and the effects of diverse communities on coevolutionary processes.

An evolved interaction network of hosts (green) and parasites (red) from Avida. Links represent actual infections between different host and parasite phenotypes (spheres).

When looking at the networks of interacting hosts and parasites, we noticed an interesting pattern. The interactions weren’t random; instead they seemed to be nested such that specialist parasites tended to interact with hosts that more generalist parasites also interacted with. One way to understand this structure is to think about Russian Matryoshka dolls (or Russian nested dolls), where the smallest doll fits inside the next smallest, which fits inside the next smallest, all the way to the largest doll. But, instead of dolls, the hosts that the most specialized parasite interacts with are a subset of those hosts the second most specialized parasite infects, which are a subset of the hosts the third most specialized parasite infects, and so on… This pattern is also true for hosts in nested networks, where the most resistant host only interacts with the parasites that have the broadest host ranges. 


Russian Matryoshka dolls. When fully assembled, all the smaller dolls will be inside the largest doll.

From plant-pollinator interactions to resource-consumer modules in food webs, nestedness is found in nature nearly every time someone goes looking for it. We are excited to also see it in Avida! That means there is probably something about the evolutionary process that produces this pattern, since nearly everything about Avida’s “biology” is different from that of other living organisms. Even when we change details about how hosts and parasites interact, something we are uniquely able to manipulate in Avida, we still end up with nested networks. We’re still trying to answer why coevolution produces this nested structure over and over again, but luckily Avida makes a particularly useful model system for this type of question. Miguel Fortuna, Aaron Wagner, Charles Ofria and I recently published a paper about the benefits of studying how these interaction networks evolve using Avida in a PLoS Computational Biology Topics Page. There are some interesting things about these topics papers, like how upon publication a copy is put up on Wikipedia for the community to edit and keep “alive,” but that topic is best kept for another blog post.
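For readers who like to see the pattern spelled out, here is a minimal sketch, not our analysis code, that checks whether a toy binary host-parasite interaction network is perfectly nested, i.e., whether each more specialized parasite's host range is a subset of every more generalist parasite's host range. Real nestedness analyses use more forgiving metrics than this strict subset test, and the phenotype labels are invented.

```python
def is_perfectly_nested(interactions):
    """interactions[p] is the set of host phenotypes that parasite phenotype p can infect."""
    # Order parasites from most specialized (fewest hosts) to most generalist;
    # because subset relations are transitive, checking neighbors suffices.
    ordered = sorted(interactions.values(), key=len)
    return all(narrow <= wide for narrow, wide in zip(ordered, ordered[1:]))

parasites = {
    "P1 (specialist)": {"H1"},
    "P2":              {"H1", "H2"},
    "P3 (generalist)": {"H1", "H2", "H3"},
}
print(is_perfectly_nested(parasites))   # True: each host range nests inside the next

parasites["P1 (specialist)"] = {"H3"}   # the specialist now infects a host that P2 cannot
print(is_perfectly_nested(parasites))   # False: the Matryoshka-doll pattern is broken
```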

Another big part of my time has been spent understanding how the diverse communities that arise affect future evolution. Because this diversity becomes part of the environment for hosts and parasites, it helps shape which traits are beneficial and which are harmful. Predicting what will happen in coevolving communities is extremely difficult because of this feedback, but that’s also what makes it fun and interesting to study. 

We have been wrapping up a manuscript describing a few interesting outcomes of this feedback, but you’ll have to wait for my next blog post to hear more about them. One result I’ll give away as a freebie is that parasite populations evolve to infect a wider range of hosts. This makes sense if you think back to the videos of the evolving interaction networks: after you have a diverse set of hosts, it probably pays off to infect as many of them as possible. I hope that by understanding how this ever-changing network of host and parasite interactions influences selection, we will also get closer to knowing why nestedness is so common in nature.  

For more information about Luis’ work, you can contact him at luis dot zaman at gmail dot com.


BEACON Researchers at Work: When Cooperating Means Just Saying No

This week’s BEACON Researchers at Work post is by University of Washington postdoc Brian Connelly.

Just another day in the lab. Making plates with Belen Mesele (l) and Helen Abera (r), two of the people working on the project with me. Our wild-type cooperator strains produce beautiful blue-green colonies due to the production of pyocyanin, another behavior regulated by quorum sensing.

Evolutionary biologists often talk like economists, particularly when the topic is cooperation. Instead of dollars, euros, or pounds, the universal currency in evolution is fitness. A species that cooperates cannot survive when competing against a non-cooperative opponent unless the fitness benefits provided by cooperation, such as those resulting from greater access to resources, outweigh the costs. To make matters more complicated, cooperative benefits often take the form of “public goods,” which benefit all nearby individuals, whether cooperator or not. This sets the stage for the emergence of “cheaters,” which exploit the cooperation of others without contributing themselves. Despite cooperation seeming at odds with the notion of “survival of the fittest,” we now have a good understanding of how cooperation can persist in the face of cheaters based on the tremendous work of Fisher, Haldane, Hamilton, Price, and those who have since followed. When the costs and benefits are favorable, and when close relatives are more likely to receive those benefits, cooperation can survive and even thrive.

Environments are always changing, and since the environment plays a dominant role in determining the fitness costs and benefits associated with all traits, natural selection may quickly change between favoring cooperation and not. When the balance shifts so that cooperation becomes more costly than beneficial, cooperators risk being driven to extinction by cheaters or other non-cooperators that do not pay those costs. So how can cooperators survive these tough times? The answer is frustratingly simple—by not cooperating. The challenge, though, is in determining when to cooperate and when to be more self-centered. We humans and other primates are—perhaps very arguably—good at estimating whether or not cooperation will benefit ourselves and those with whom we are similar, either genetically or in our beliefs. We are able to do this by integrating a great deal of information about our world and the people in it. But we are not alone in this.

Surprisingly, it turns out that even relatively “simple” bacteria are extremely effective at determining whether or not to cooperate based on the state of their environment and the composition of their population. One of the ways that these bacteria accomplish this is through quorum sensing. With quorum sensing, individuals communicate with each other by releasing and detecting small molecules, which are used as signals. When an individual detects low levels of the signal, it can use this information to assume either that there are too few other cooperators nearby to produce sufficient benefits by cooperating, or that the public good will be flushed out of the environment before it can be used. However, when that individual detects high levels of the signal, it is likely that there are many relatives nearby that would benefit from cooperation. By communicating this way using signals specific to their own species, bacteria use quorum sensing to rapidly adjust their behaviors to maximize their fitness as the environment changes.
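Conceptually, the decision rule amounts to a threshold on signal concentration. Here is a minimal sketch of that logic; it is an illustration only, not a model of any particular species, and the formula relating density, washout, and signal level is invented for the example.

```python
def decide_to_cooperate(population_density, washout_rate, threshold=1.0):
    """Cooperate (e.g., secrete a public good) only when the signal concentration,
    which rises with the density of signalers and falls with washout, is high enough."""
    signal_concentration = population_density / (1.0 + washout_rate)
    return signal_concentration >= threshold

# Few relatives nearby, or a fast-flowing environment: stay selfish.
print(decide_to_cooperate(population_density=0.5, washout_rate=0.1))   # False
# A dense population of kin in a still environment: cooperate.
print(decide_to_cooperate(population_density=5.0, washout_rate=0.5))   # True
```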

Josephine Chandler recently wrote about her fascinating work that addressed how bacteria use quorum sensing to control the production of antibiotics. While she investigated this process as a means of competing with other species, it can also be viewed as a form of cooperation among members of the same species. By using antibiotics to kill off competitors, sometimes self-sacrificially, more resources become available to those that remain. And because species often have resistance to the antibiotics that they produce, those that remain after an antibiotic attack are likely to be close relatives.

The use of antibiotics is just one example of a behavior controlled by quorum sensing. Since its discovery in the early 1970s, quorum sensing has been observed across a wide variety of species. Among the behaviors regulated by quorum sensing, those related to cooperation and other social interactions are perhaps the most prevalent. Because of this, quorum sensing is believed to play a key role in allowing cooperation to persist in ever-changing environments.


Colonies formed by two of our strains. Through the production of elastase, our cooperators are able to break down the proteins present in this milk agar plate, forming large, clear halos. Our non-cooperator strain does not produce elastase, so it is unable to break down the milk proteins, and a much smaller halo is produced.

Although the connection between quorum sensing and cooperation is now well known, little is understood about how these behaviors became interlinked. To begin addressing this, I am currently working in Ben Kerr’s lab on a number of projects that investigate the co-evolution of cooperation and quorum sensing. To gain a broader picture of this process, we’re pairing microbial experiments with computational and mathematical models.  The cooperative behavior we’re focusing on in our study system, Pseudomonas aeruginosa, is the production of the digestive enzyme elastase. When secreted into the environment as a public good, elastase breaks down large proteins into smaller, usable sources of nutrients available to all cells in the surrounding area. In environments where these large and otherwise inaccessible proteins are the main nutrient source, this behavior is extremely beneficial. (Here is a short video demonstrating growth of our bacteria.)

In these environments, where the proteins are a limited source of resource, cooperators do better due to the benefits provided by elastase. We can measure the amount of cooperation occurring within populations by examining the size of the clearing that occurs when extracting and plating the elastase that is produced. Note: faces add no scientific value.

By exposing our populations to different environments over many generations, we are directly observing how communication and cooperation co-evolve. Through these experiments, we are investigating how quorum sensing enables cooperation to be maintained, the types of environments in which this occurs, and the different ways in which this regulation can occur. We hope that through this work, we can gain a greater understanding of the complex social processes that occur in natural ecosystems and in some of the infections that create tremendous health challenges.

For more information about Brian’s work, you can contact him at bdc at bconnelly dot net.


BEACON Researchers at Work: Addressing the Next Generation Science Standards

This week’s BEACON Researchers at Work blog post is by MSU graduate students Melissa Kjelvik and Liz Schultheis.

The current landscape of K-12 science education is shifting – moving away from memorization of science facts to an approach based on the scientific method where students are taught quantitative skills and how to construct arguments from evidence. These skills are increasingly important as technology increases our access to large pools of data that must be quickly interpreted – including hot science topics in the news, such as evolution and climate change. While teachers support the shift, they currently lack the classroom resources necessary to make the change in their classrooms. Additionally, teachers are worried about addressing the Next Generation Science Standards (released April 2013) and preparing students according to the ACT Readiness Standards, as both have increased expectations of analytical and quantitative skills for K-12 students. At present, there is no resource available to teachers that allows them to reinforce these skills repeatedly throughout the school year and across grade levels, while also covering core content and hitting on all parts of the scientific process.

BEACON has many overlapping goals with the new science standards, and is well situated to help teachers address their concerns. First, an understanding of evolution depends on a student’s analytical and quantitative skill set. Much disbelief about evolution comes, not from a lack of evidence, but the inability of the audience to understand the scientific process and synthesize evidence to make an argument. Second, a multidisciplinary approach is essential when addressing the new science standards, as quantitative skills must be brought to bear on all science topics and be used as a way of thinking, more than just one unit within the curriculum. Third, once students understand scientific principles, such as the evolutionary process or how to ask questions of the natural world, they will be more excited to pursue a scientific career than if they believe science is purely fact memorization. Students will be able to apply these skills to other careers as well – just as programmers and engineers in BEACON use principles of natural selection to design better software and products. To achieve the goals of BEACON and science standards, teachers need a multidisciplinary and versatile tool that closely resembles the actual practice of scientific research and quantitative analysis.

Introducing Data Nuggets

We are currently developing a tool that we think has the potential to address these curriculum changes and BEACON goals: Data Nuggets, which bring data collected by scientists into the classroom, thus giving students the chance to work with real data – and all its complexities. Data Nuggets are worksheets designed to help students practice interpreting quantitative information and make claims based on evidence. The standard format of each Nugget provides a brief background to a researcher and their study system along with a small, manageable dataset. Students are then challenged to answer a scientific question, using the dataset to support their claim, and are guided through the construction of graphs to facilitate data interpretation. Various graphing and content levels allow for differentiated learning for students with any quantitative or science background. Because of their simplicity and flexibility, Data Nuggets can be used throughout the school year and teachers can provide higher graphing levels as students build confidence in their quantitative skills.

Data Nugget History

Utilizing the unique teacher-graduate fellow partnership organized by the Kellogg Biological Station’s GK-12 program, Data Nuggets were created by graduate students in response to discussions with Michigan teachers who expressed concern about students’ ability to make claims based on evidence. When first designing Nuggets, GK-12 fellows held a teacher workshop at KBS to solicit feedback on the structure, organization, and content to make Data Nuggets a teacher and classroom friendly resource that could be used at all grade levels. Teacher feedback continues to be an invaluable component to the development of the Nuggets as we travel to conferences such as ESA Life Discovery and National Association of Biology Teachers. More recently, the Nugget network has expanded beyond GK-12 to include datasets from graduate students, faculty, teachers, and undergraduate researchers at KBS.

The Future of Data Nuggets: Integration of BEACON research

For the next year, we will be supported by BEACON funds to address both the challenges BEACON researchers face when communicating evolution to broad audiences and the lack of education resources available for teachers to teach quantitative skills. Utilizing BEACON’s network and resources we are excited to:

1) develop and implement an assessment tool documenting the ability of Data Nuggets to improve students’ quantitative skills and understanding of science

2) provide professional development for BEACON researchers by facilitating workshops at each institution to create Data Nuggets from their research

3) enhance accessibility of Data Nuggets by creating a user-friendly website and presenting Data Nuggets at teacher conferences.

Coming to a BEACON University Near You: Data Nugget Workshops

We anticipate Nuggets will be a popular tool for academics to share their research with broad audiences. The short, simple Nugget template facilitates the creation of additional worksheets by making the process quick and easy for faculty and graduate students in all disciplines. Researchers who create Nuggets will improve their science communication skills, important when giving talks, writing papers, and submitting grants. Additionally, graduate students involved in BEACON can make Nuggets on findings from multidisciplinary collaborations, such as connections between evolution and engineering.

We will be organizing workshops at BEACON-affiliated universities to provide the training necessary for BEACON researchers to create a Data Nugget of their own. We will walk through the basic components of the Data Nugget and provide feedback as to the appropriateness of their Nugget for specific grade levels. Additionally, we will reach out to K-12 teachers at schools near each institution to increase awareness of Data Nuggets and invite them to make Data Nuggets of their own.

“As we get our students ready for ACT testing, data nuggets are wonderful sets to use in our classroom because they are relevant and introduce ‘real’ research to our students who might not have this type of exposure otherwise.” ~ Marcia Angle, Lawton Middle School

For more information about this project, you can contact Melissa at kjelvikm at msu dot edu.


BEACON Researchers at Work: Multi-objective Evolutionary Optimization to Allow Greenhouse Production/Energy Use Tradeoffs

This week’s BEACON Researchers at Work blog post is by MSU graduate student José R. Llera.

My name is José R. Llera, and I received my B.S. in Computer Engineering from the University of Puerto Rico at Mayagüez. I learned about the BEACON center and their research during a visit to MSU. The study of evolutionary computation caught my interest, and I’ve been studying it at MSU ever since as a PhD student. One of the things that interests me most about this field of study is that evolutionary computation is multi-disciplinary by nature, and you get to work with passionate people who are experts in their respective fields on an almost daily basis. This has given me the opportunity to learn exciting things from areas that would normally be outside my scope, and it opens many possibilities in solving difficult engineering problems.

One particularly exciting project was introduced to me by Dr. Goodman involving greenhouse optimization. The main motivation behind this project lies in the growing global demand for fresh vegetables, and greenhouse innovation is a hot topic for helping meet this demand. In particular, China has drafted ambitious plans to design and build a new generation of greenhouses, helping to supply its year-round needs for fresh vegetables in a way that is economically viable and environmentally friendly, as stated in its No. 1 central document of 2012, which underscores the importance of scientific and technological innovation for sustained agricultural growth.

This led to collaboration with many members from inside and outside MSU. I’m currently working directly with Dr. Goodman, Dr. Prakarn Unachak (a post-doctoral researcher at BEACON), and Chenwen Zhu (a graduate student from Tongji University, currently at MSU) in developing a system which can simulate a greenhouse environment, as well as applying evolutionary algorithms to obtain an optimized strategy for greenhouse control. An experimental greenhouse is being built in Tongji University, which is located in Shanghai, China. We expect the new models and control methodologies we are developing will be parameterized and validated at this facility. Dr. Goodman, Zhu and I visited that greenhouse in late 2012, giving us a first-hand look at its construction and operational capabilities. 

A greenhouse is a complex system with interacting parameters like crops, facilities, climate, and cultivation patterns. How to coordinate these parameters for an efficient, productive and ecologically safe greenhouse with a relatively optimal growing environment at the least cost has always been a research hotspot in the horticulture field. However, meeting these requirements is not trivial in practice because of the complexity of the greenhouse environment. Our team is currently working on using a form of multi-objective evolutionary optimization (“Multi-Objective Compatible Control”, or MOCC) to allow dynamically balancing the needs for crop production against the financial and environmental costs associated with operating a greenhouse.

As for the “compatible control” part of MOCC, it is a hierarchical control strategy that takes advantage of the nature of greenhouse systems. Compromises become possible when flexible parameters of the production system, if any, are relaxed without damaging overall system performance over the long term. Such quantitative trade-offs could be made among the economic return of the crop, the cost of maintaining and operating greenhouse facilities, and the control precision of actuators over a comparatively short run.

With enough information on the greenhouse environment, the proposed method can obtain a set of operating points, each of which is non-dominated (in the Pareto sense) by the others in the set in terms of the given objectives. That is, no point in the set is better than another with respect to all objectives. Such a set is known as a Pareto set, and it is the end result of many multi-objective optimization algorithms, including MOCC. As more and more points are tested, the sequence of such Pareto sets approaches the Pareto front, a curve along which no objective can be improved without sacrificing performance in some other objective.
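For concreteness, here is a minimal sketch of the non-domination test and Pareto filtering step that multi-objective optimizers such as MOCC and NSGA-II build on. The candidate points are made up, and both objectives (say, energy cost and lost production) are assumed to be minimized, matching the figure below.

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly better in at least one
    (both objectives are minimized here)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_set(points):
    """Keep only the points that no other point dominates."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Made-up (energy cost, lost production) pairs for candidate control strategies.
candidates = [(5.0, 1.0), (3.0, 2.0), (4.0, 2.5), (1.0, 4.0), (2.0, 3.5)]
print(pareto_set(candidates))   # (4.0, 2.5) is dropped: (3.0, 2.0) beats it on both objectives
```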

Example Pareto set. Each axis represents an objective, and individuals are optimized to be as close to the origin as possible.

The best way to encapsulate, express, and implement a greenhouse system is via mechanistic models that describe the dynamics of the climate and the crop. The approach for this project uses evolutionary techniques for the optimization aspect of the problem, and given the random nature of the evolution, the final mechanistic model must be robust enough to deal with all possible scenarios covered in the evolution process.

Typical greenhouse and variables used when modeling

Ph.D. student Bram Vanthoor of Wageningen UR Greenhouse Horticulture has developed a mathematical method for designing greenhouses that are better adapted to local conditions. We’ve found that this model is fairly comprehensive, versatile, and also tested and proven in real settings. This method has been tested for the Netherlands, Spain and USA. Although implementing all the details in Vanthoor’s model has turned out to be computationally expensive, it’s flexible enough to be adapted to our needs so that it runs with reasonable speed and accuracy.

Cucumbers and lilies are the final target crops for the experimental greenhouse. However, tomatoes are currently selected as the model crop since tomato is one of the most widely produced greenhouse vegetables in the world and knowledge about tomato yield modeling is well established. Once we have developed suitable greenhouse models and control strategies using tomatoes, it should not be too difficult to adapt them for other crops.

Dr. Unachak has been working on simplifying and speeding up the greenhouse model, and the running time for a complete growth cycle has been reduced to reasonable levels for testing evolutionary algorithms. Zhu and I are currently working on determining the most appropriate aspects for optimizing our greenhouse control using NSGA-II, a type of multi-objective evolutionary algorithm. We will be able to run NSGA-II on top of the greenhouse model soon, and results that perform well could be used in the experimental greenhouse in the near future.

For more information about José’s work, you can contact him at lleraort at msu dot edu.


How and why do animals evolve grouping behavior?

This blog post is reposted from MSU graduate student Randal Olson’s blog.

In the concluding remarks of their book Living in Groups, Jens Krause and Graeme Ruxton highlighted “understanding how and why animals evolve grouping behavior” as one of the major topics in animal grouping behavior research that would benefit from further study. Indeed, grouping behaviors are present in animals across all taxa, ranging from microscopic bacteria to gargantuan humpback whales. Some scientists even believe that part of the reason humans evolved such a high level of intelligence is that they lived and interacted in groups for hundreds of thousands of years. Yet, despite the omnipresence and apparent importance of grouping behaviors, we are only now beginning to understand the mechanisms underlying these behaviors. How and, perhaps more importantly, why do animals live in groups?


Animals of all shapes and sizes live in groups, yet we’re only now beginning to understand why.
Picture credit: Bidgee

Here at the BEACON Center for the Study of Evolution in Action, we specialize in looking at life from an evolutionary perspective. By taking such a perspective, we’ve made a number of incredible discoveries that would not have been possible if we didn’t consider evolution as an important force shaping all forms of life around us. As such, I found the concluding remarks of this book particularly interesting, and worthwhile to elaborate upon.

Thinking about grouping behavior in an evolutionary context

In the preceding chapters of the book, Krause and Ruxton outlined many of the leading hypotheses explaining the costs and benefits of grouping behavior. Interestingly, the authors cautioned their readers that although these benefits certainly seem plausible, the hypotheses only establish “that grouping behavior would be advantageous under certain ecological conditions,” but “do not address the actual selection mechanism” that could select for grouping behavior.

This statement addresses one of the major pitfalls that scientists run into when thinking about traits in an evolutionary context. Thus, the authors found it important to clarify that just because a phenotypic trait (e.g., a behavior or morphological feature) is beneficial under certain ecological conditions, it does not necessarily mean that the benefit is sufficient to select for that trait over evolutionary time. Both the benefits and the costs of the trait must be considered.

The benefits of grouping behavior must outweigh the costs for grouping behavior to be viable on an evolutionary scale.
Picture credit: winnifredxoxo

With this fact in mind, the authors asked researchers to establish experimental systems that can directly test the various hypotheses attempting to explain how grouping behavior evolves. Mind you, Living in Groups was published in 2002, so I fully expected there to be a ton of research in this area by now. Yet, much to my surprise, I only found a handful of papers broaching the subject. I’ve listed the papers I’ve found so far below.

If I’m missing any papers, please email me or leave a comment and I’ll update the list.

List of papers directly testing evolution of grouping behavior hypotheses
Author(s) | Title | Year | Hypothesis
Christopher R. Ward, Fernand Gobet, and Graham Kendall | Evolving collective behavior in an artificial ecology | 2001 | foraging and predation
Timothy C. Reluga and Steven Viscido | Simulated evolution of selfish herd behavior | 2005 | selfish herd theory
Andrew J. Wood and Graeme J. Ackland | Evolving the selfish herd: emergence of distinct aggregating strategies in an individual-based model | 2007 | selfish herd theory
Colin R. Tosh | Which conditions promote negative density dependent selection on prey aggregations? | 2011 | dilution effect
Christos C. Ioannou, Vishwesha Guttal, and Iain D. Couzin | Predatory Fish Select for Coordinated Collective Motion in Virtual Prey | 2012 | dilution effect
Randal S. Olson, David B. Knoester, and Christoph Adami | Critical interplay between density-dependent predation and evolution of the selfish herd | 2013 | selfish herd theory
Randal S. Olson et al. | Predator confusion is sufficient to evolve swarming behavior | 2013 | predator confusion effect

One interesting thing to note here is that all of these papers use some form of digital model to directly test the hypotheses. Why is that?

Why can’t we just use biological model systems?

As it turns out, evolving behavior in biological model systems is hard.

  1. Evolution of behavior in biological model systems takes a long time. Krause and Ruxton suggested that the best biological system in which to study the evolution of grouping behavior could produce four to five generations per year. At that rate, how many years would it take to evolve grouping behavior in a species that initially does not form groups? Even with the book's optimistic estimate of three years, that is a long time to run an experiment that may very well be a dud.
  2. Evolution of behavior in biological model systems is difficult to control and manipulate. Anyone who has worked with live animals knows how hard it is to experimentally control every factor in the experiment. Now imagine you want to test a specific form of selection on your experimental population, such as the predator confusion effect, without any confounding effects. I feel bad for the graduate student who gets assigned that project!
  3. Evolution of behavior in biological model systems is difficult to measure. How do you quantify “groupiness” in a biological system? There have been a few impressive approaches to measuring grouping behavior in biological systems, but they always involve time-intensive video recording and analysis. Now imagine running these video analyses on an evolutionary scale, every generation, for multiple years. I just exhausted myself by merely thinking about such an endeavor.

With these complications in mind, it shouldn’t be so surprising that we don’t see many biological model systems for studying the evolution of grouping behavior. The hypotheses explaining grouping behavior have yet to be refined enough, and refining them in biological model systems is far too expensive (both time- and resource-wise).

Digital evolutionary models as experimental test beds

In the past two decades, we’ve seen digital evolutionary models such as Avida transform into powerful experimental test beds for studying core evolutionary processes. Researchers have used these models to refine our understanding of how evolution works (e.g., how complex traits evolve), and even to make fundamentally new discoveries (e.g., survival of the flattest).

As Randall Beer aptly put it,

The early theoretical development of a field typically involves the careful study of simpler idealized models that capture the essential conceptual features of the phenomena of interest. Such model systems have a long history in physics. For example, it was not until Galileo’s consideration of such idealized situations as frictionless planes that theoretical physics in the modern sense of the word really began.

The power of such an idealization is that it simultaneously makes clear a deep principle of motion (acceleration, not velocity, is proportional to force) and provides a well-defined way in which the complicating effects of friction can be understood (as an external force acting on the system).

Instead of frictionless planes, we need frictionless brains.

In essence, digital evolutionary models such as EOS provide the “frictionless brains” for understanding the mechanics underlying the evolution of grouping behavior. They provide a test bed to rapidly prototype and refine our hypotheses before we conduct the expensive experiments in biological systems, thereby saving enormous amounts of time and money. I may be preaching to the choir here, but I feel it’s important to say: It’s time for grouping behavior researchers (and biologists as a whole) to abandon the antiquated notion that digital models can’t tell us anything about natural processes.

An evolved digital swarm from the EOS platform

Hybrid digital/biological model systems

Although I’m obviously critical of using biological model systems for early hypothesis testing and refinement, there has been a recent movement to merge biological and digital systems that I feel is worth mentioning. In particular, a recent approach from Iain Couzin’s lab has shown exceptional promise. In this experiment, Ioannou et al. projected virtual prey onto the side of a fish tank and had live predatory fish “feed” on the virtual prey. As the title of the paper suggests, the predator’s feeding preferences selected for grouping behavior in the prey after several simulated generations. Effectively, this experiment demonstrated one way that predation by a real predator can select for the evolution of grouping behavior in prey. How impressive is that? (One small caveat: the experiment only worked because grouping behavior was present in the population from the beginning, so it does not yet explain how grouping behavior arises in the first place.)

Video demonstration of Ioannou et al.’s hybrid model system

These hybrid model systems seem to capture the biological complexity of traditional biological systems, while still retaining many of the advantages offered by digital systems. I became so enamored with the idea of hybrid systems that I put together a proposal to build such a system myself. If only I could get it funded!

So, where does that leave us?

Understanding the mechanisms underlying the evolution of grouping behavior is going to be a difficult yet enlightening line of research for grouping behavior researchers. Digital evolutionary models have shown promise of expediting this line of research by establishing a strong basis in theory before the experiments in biological systems proceed. Although digital evolutionary models are unlikely to make exact quantitative predictions about how grouping behavior evolves (e.g., “the predator confusion effect confuses the predator 50% of the time when 10 prey are in the group”), they will allow us to make qualitative predictions about the effects of various selection pressures (e.g., “the predator confusion effect is sufficient to evolve grouping behavior”). Once a strong basis in theory is established, we can move forward into testing our hypotheses in hybrid digital/biological model systems and eventually fully biological model systems to relax the assumptions of our models. In the meantime, however, this line of research would benefit most from concentrating on refining theory in digital evolutionary models.

You may leave comments for Randy on the original post.


Bacterial warfare using antibiotics and communication

This week’s BEACON Researchers at Work post is by University of Washington research assistant professor Josephine Chandler.

Bacteria can compete with one another by making antibiotics

Cartoon version of microbial warfare. An ant (in red) grows a colony of bacteria (in blue) that can blast away another microbe (yellow) with antibiotics. Image courtesy of Dr. Jake McKinlay, a former PhD student at Michigan State University and now a Professor of Microbiology at Indiana University (http://www.indiana.edu/~mckinlab).

Competition occurs all around us, between people and institutions, and among plants and animals. In nature the battle for limited resources and space can be a fight for survival. Individuals that have the skills necessary to win have an advantage and survive; those that do not may die. Thus competition can be a strong force driving a population to change, evolving new traits or better strategies to win.

Bacteria also compete with one another using an array of destructive compounds and strategies. Antibiotics, typically associated with their medicinal properties, are actually made by bacteria and other microbes and used as a weapon during competition. Antibiotics were first discovered by a scientist named Alexander Fleming. They were discovered by accident, after Dr. Fleming’s return from a month-long holiday with his family. Dr. Fleming had stacked up all of his experiments in a corner of his laboratory before leaving, and on his return he observed that one of the plates containing the bacterium Staphylococcus aureus had been contaminated with another microbe that seemed to secrete a compound that could destroy the Staphylococcus. Later he confirmed that the secreted substance, which he called penicillin, could kill a number of different kinds of bacteria. Eventually penicillin was mass-produced and used as an antibiotic to treat people, and its discovery is now said to have changed the face of medicine (Alexander Fleming was named one of Time magazine’s 100 most influential people of the 20th century). Not long after the discovery of penicillin, a number of other microbially produced antibiotics were discovered and developed for medical use. It was this story of the discovery of penicillin that first lured me to the field of microbiology, during an undergraduate course at the University of Iowa.

Paper filter disks saturated with antibiotics cause growth inhibition on a plate spread with the bacterium Staphylococcus aureus. Image from phil.cdc.gov.

Antibiotic production and quorum sensing

The anglerfish has an appendage off the tip of its head that glows because of bacteria that use quorum sensing to control production of light. The bacteria allow the anglerfish to attract curious prey in the deep ocean darkness. Image from http://si.wsj.net.

A single bacterium probably cannot produce enough antibiotic to kill other bacteria (see Mlot C. Science 324:1637-1639). However, many bacteria have evolved a way to ‘count’ themselves using a system called ‘quorum sensing.’ Quorum sensing involves communication between bacteria using small diffusible signals. Antibiotic production is energetically expensive, and quorum-sensing control is thought to reduce the overall expense by delaying production until the population is large enough to produce a killing dose. Thus quorum sensing may provide a winning edge during competition. Bacteria are constantly competing with each other in the environment, and this competition may have influenced the evolution of quorum-sensing systems.
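A rough back-of-the-envelope calculation (my own illustration, ignoring antibiotic degradation and diffusion, and not taken from the Mlot article) shows why population size matters so much here. If each cell secretes antibiotic at a constant rate p into a shared volume V, then N cells reach an inhibitory concentration C_MIC only after a time of roughly

\[
t \approx \frac{C_{\mathrm{MIC}}\, V}{p\, N},
\]

which is impractically long for a lone cell (N = 1) but shrinks in direct proportion to population size. Waiting for a quorum before switching production on therefore makes the costly investment far more likely to actually deliver a killing dose.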

Using laboratory experiments and mathematical approaches to understand quorum sensing

Our laboratory is interested in the connections between quorum sensing, antibiotic production, and competition among bacteria. To examine these connections I set up a laboratory experiment with two soil bacteria, Burkholderia and Chromobacterium, each of which uses quorum sensing to regulate antibiotic production. I first showed that the antibiotics produced by each species can inhibit growth of the other. Next, I used bacterial mutants that cannot quorum sense to show that, in each case, the ability to compete is enhanced by quorum-sensing-regulated antibiotics.

Next, I wanted to understand the benefits of using quorum sensing to regulate antibiotics during competition. To do this I teamed up with a physicist and a mathematician, and we devised a mathematical model of our laboratory system. The model used a series of differential equations with parameters for bacterial growth, antibiotic-induced death, and quorum control of antibiotic production, and it also incorporated a cost associated with making the antibiotic. With this model we could change various aspects of the system and examine the different outcomes; for example, we could determine the effects on competitiveness when we made the antibiotic more costly to produce. We found that a population competed better when it began producing antibiotic at high density rather than at low density, as it does under quorum-sensing regulation, and this advantage was more pronounced when the cost of antibiotic production was large. Thus our mathematical model supported the idea that quorum-sensing regulation of antibiotics enhances the ability to compete by sparing the cost of antibiotic production until there are ‘enough’ bacteria present.
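The post doesn't reproduce the actual equations, so the following is only a rough sketch, in Python, of what a model in this spirit could look like: two competing populations, each killed by the other's antibiotic, with antibiotic production switched on above a density threshold (a crude stand-in for quorum sensing) and a growth-rate cost paid while producing. The structure and every parameter value are invented for illustration and are not taken from the published model.

```python
# Rough sketch of a quorum-controlled antibiotic competition model.
# Structure and all parameter values are illustrative, not the published model.
from scipy.integrate import solve_ivp

r = 1.0        # intrinsic growth rate (per hour)
K = 1e9        # shared carrying capacity (cells per mL)
kill = 1e-9    # killing rate per unit of the competitor's antibiotic
prod = 1e-3    # antibiotic production rate per producing cell
cost = 0.1     # fractional growth cost paid while producing
quorum = 1e7   # density threshold above which production switches on


def producing(n):
    """Quorum-sensing stand-in: produce antibiotic only above a density threshold."""
    return 1.0 if n > quorum else 0.0


def model(t, y):
    n1, n2, a1, a2 = y                    # two populations and their antibiotics
    p1, p2 = producing(n1), producing(n2)
    shared = 1.0 - (n1 + n2) / K          # both species draw on one resource pool
    dn1 = r * (1.0 - cost * p1) * n1 * shared - kill * a2 * n1
    dn2 = r * (1.0 - cost * p2) * n2 * shared - kill * a1 * n2
    da1 = prod * p1 * n1                  # antibiotic accumulates; no decay term here
    da2 = prod * p2 * n2
    return [dn1, dn2, da1, da2]


# Start both competitors at equal, low density with no antibiotic present.
sol = solve_ivp(model, (0.0, 48.0), [1e5, 1e5, 0.0, 0.0], max_step=0.1)
print(sol.y[:2, -1])  # final densities of the two competitors
```

Giving the two populations different thresholds or costs, for example setting one population's threshold to zero to mimic a producer that pays the cost from the very first cell division, is the kind of head-to-head comparison described above.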

By combining laboratory and mathematical approaches, we have begun to address questions about how quorum-sensing systems contribute to bacterial competition. This work shows how the two approaches together can help us study evolutionary questions and better understand the selection pressures that shape important bacterial processes, such as the production of antibiotics.

For more information about Josephine’s work, you can contact her at jchan4 at u dot washington dot edu. 
