Recruiting scientists for industrial jobs

I’m passionate about recruiting. Few decisions a manager makes are more important than hiring decisions. Your philosophy and standards for recruiting are a key driver of your group’s capabilities and long-term growth. As my boss recently told me, “The talent of your staff is one of the caps on your performance.” I started working with our internal recruiting team in 2007 and haven’t stopped since. The main reason is that I see it as a key responsibility to the organization. The great fringe benefit, though, is getting to know a large group of incredibly talented students and an only slightly smaller group of incredibly talented professors.

Back in March, I had the privilege of sitting on an impromptu panel on industrial careers for physicists at the APS meeting in Baltimore. What struck me was that my colleagues from Dow, DuPont, and other companies held views on recruiting very similar to my own. The students in that session asked us great questions about how to get hired for an industrial position. Inspired by this, I wrote a short piece on the plane ride home, which the great folks at Physics World published in the September 2013 issue.

I’m hoping that my colleagues at Science on Google+ and I will soon be hosting a Hangout on Air where science and engineering students at all levels can ask a panel of industrial scientists questions about their careers. Keep an eye out for it.

Who can keep up with the milestones in scientific computing?

A decade ago, when I was a researcher in computational physics, my graduate advisor had grants that gave us access to fairly large supercomputers. The computers I used for my research were nowhere near as fast as the new petaflop-class supercomputer built at Indiana University. Ars Technica recently published a quick look at this machine, which it bills as the first petaflop computer at a university.

At that time, the big iron was at places called MSRCs – Major Shared Resource Centers. You had to pay for time on those machines – and for many of them, get security clearance, since they were owned by the military. (For example, my work was mostly done at the MSRC at the Naval Oceanographic Office.) By the time I graduated, universities were just starting to build their own in-house supercomputers.

Back then, supercomputers were used to perform fairly specialized types of calculations. I was doing molecular dynamics calculations – essentially solving Newton’s second law (F = ma), in its differential form, for a quarter of a billion atoms simultaneously. Other folks were doing climate modeling (a different set of differential equations) or modeling processes for the Stockpile Stewardship program (i.e., nuclear weapons). These days, there are a host of bioinformatics applications that can use this horsepower, and even more social science applications, all falling loosely under the heading of Big Data. With each increase in the number of floating-point operations per second (FLOPS), the size and complexity of the problems that can be tackled also increases.
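
To give a concrete sense of what a molecular dynamics code actually does, here’s a minimal sketch of the core loop in Python: a velocity Verlet integrator, a standard workhorse for this kind of simulation. The Lennard-Jones pair force and the toy 27-atom lattice are illustrative stand-ins of my own; a production code would use far more sophisticated force fields and parallelize the force evaluation across thousands of processors.

```python
import numpy as np

def lennard_jones_forces(pos, epsilon=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces; O(N^2), fine for a toy system."""
    forces = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r_vec = pos[i] - pos[j]
            r2 = np.dot(r_vec, r_vec)
            sr6 = (sigma ** 2 / r2) ** 3
            # Force = -dU/dr for U = 4*eps*(sr6^2 - sr6), directed along r_vec
            f = 24.0 * epsilon * (2.0 * sr6 ** 2 - sr6) / r2 * r_vec
            forces[i] += f
            forces[j] -= f
    return forces

def velocity_verlet(pos, vel, mass, dt, steps):
    """Integrate Newton's second law, F = ma, one timestep at a time."""
    forces = lennard_jones_forces(pos)
    for _ in range(steps):
        pos = pos + vel * dt + 0.5 * (forces / mass) * dt ** 2  # new positions
        new_forces = lennard_jones_forces(pos)
        vel = vel + 0.5 * ((forces + new_forces) / mass) * dt   # new velocities
        forces = new_forces
    return pos, vel

# A 27-atom cubic lattice in reduced units; real runs use millions of atoms.
grid = np.arange(3) * 1.5
pos = np.array([[x, y, z] for x in grid for y in grid for z in grid])
vel = np.zeros_like(pos)
pos, vel = velocity_verlet(pos, vel, mass=1.0, dt=0.001, steps=100)
```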

Despite Ars Technica’s flashy title for their piece, Indiana University certainly isn’t the first university to build an in-house supercomputer, and certainly not the first to get plaudits for having a Top 25 supercomputer. My own alma mater briefly owned a Top 10 supercomputer in the 2002-2003 timeframe. They are just now installing the third generation of that system, which clocks in at 146 teraflops, or about 15% of the IU machine’s speed.

While IU has the fastest supercomputer at a public university today, the pace of progress will quickly eclipse their machine. In the relatively short span of 11 years, the fastest computers in the world went from about a teraflop to about a petaflop: a thousandfold increase, or roughly a doubling in speed every year. But the pace of progress doesn’t change the fact that, with this computing power at their disposal, researchers at IU now have the capability to solve these larger, more complex problems. And they can do it without having to win a grant of either money or computing resources. That’s a big deal for the progress of science, even if the superlative is fleeting.
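
For the curious, the arithmetic behind that doubling claim is a one-liner (using the rough teraflop-to-petaflop factor of 1000 from above):

```python
# A factor of ~1000 in speed over 11 years implies an annual growth factor of:
annual_factor = 1000 ** (1 / 11)
print(f"{annual_factor:.2f}x per year")  # ~1.87x, i.e., close to a doubling
```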

2013 Wolf Prizes

The coming of the new year always heralds a new group of Wolf Laureates. The Wolf Prizes were established in 1976 to “promote science and art for the benefit of mankind,” and they have since become important awards recognizing significant contributions across many fields of scientific endeavor. In the physics world, the Wolf Prize is often a stepping-stone on the way to an eventual Nobel.

This year’s Wolf Prize in Physics was split between Juan Ignacio Cirac and Peter Zoller for their work in quantum optics and quantum information theory. Particularly cited was their joint work in 1995 on a practical model of a quantum computer implemented with trapped ions. This model has been the basis for much of the current experimental work in quantum computation.

It’s hard not to see the significance of this. Much as Peter Higgs’ proposal of a massive boson eventually led to the recent successes at CERN, Zoller and Cirac’s work not only paved the way for quantum computation – a field that is both intellectually exciting and perhaps eventually of great practical use – but also laid the path to a much deeper understanding of one of the wildest frontiers in physics.

The Wolf Prize in Chemistry was awarded to Robert Langer for his work on biodegradable polymeric materials for drug delivery. I have to admit I did a double take when I read this because, after all, drug delivery has been a ubiquitous rationale in materials science for the past 15 years. Everyone is doing drug delivery! Whole academic departments have reorganized themselves around biomaterials, with drug delivery being a major focus. But when you consider that Langer’s work pioneered the area, it is unsurprising that he is being recognized with this award.

I particularly appreciate that the Wolf Foundation awards a prize in Agriculture. I think we underestimate the importance of agricultural science to the world at large, even with the attention being paid to it these days. What I found very interesting is that Jared Diamond, the author of Guns, Germs, and Steel, shared the prize for his research into the relationship between human society and agriculture. Diamond, a 1990 MacArthur Fellow, also received the National Medal of Science for his work in this area. The gentleman with whom he shared the prize is Joachim Messing, who has made great strides in crop plant genomics, including establishing the Rutgers Plant Genome Initiative.

While I’m not going to dig too deeply into them, there was also a joint prize in Mathematics awarded to two American mathematicians, Michael Artin and George Mostow, for their contributions to geometry and Lie group theory, as well as a prize in Architecture awarded to Eduardo Souto de Moura, a Portuguese architect.

Visit to the NC Museum of Natural Sciences

This weekend, the entire family packed up and headed down the mountain for a visit to the NC Museum of Natural Sciences. I felt like I needed to share some of this experience with the world at large, because if you’re not familiar with the NCMNS, you are missing something special. For years, the museum was a solid and fun experience for me, my partner, and the geeklings, the latter of whom especially enjoy the skeleton of Acrocanthosaurus. With the recent completion of the Nature Research Center, the museum has become a real treasure. We made our second visit to the NRC and our first visit to the Museum’s “backyard,” the Prairie Ridge Ecostation. I’ll talk more about both places below the break.


Getting into the cell membranes of antibiotic-resistant bacteria

The growing prevalence of highly resistant bacterial strains may be one of the key public health issues of this century. What is important to note is that most, if not all, of our current antibiotics are based on defense mechanisms that already exist in nature. The converse is also true: most bacteria already have genes that confer some resistance to these defenses, and our use (overuse?) of antibiotics derived from them has selected for bacterial strains that express these resistance genes. Researchers are now measuring the cell membranes of bacteria to understand how to design better antibiotics that can treat infections by antibiotic-resistant strains. Some of this work was recently highlighted on Phys.org.

Think about the last time you got an antibiotic. Did it end with -cillin? (Penicillin, methicillin, ampicillin, etc.) These are a common class of molecules originally derived from the Penicillium mold. The chemistry they use to do their business is largely the same; the primary differences between them are in the little bits attached to the core of the molecule that help it get to where it’s supposed to be.

Did your antibiotic begin with Cef- or Ceph-? (Cephalosporin, cefotaxime, etc.) These molecules have the same business end as the -cillins, but are derived from a different fungus.

What about -cycline (tetracycline, doxycycline, etc.) or -mycin (vancomycin, neomycin)? These are all derived from the Streptomyces genus of bacteria, many species of which are common soil bacteria. The chemistry of these antibiotics is completely different from that of the penicillin and cephalosporin classes, which is one reason they are often used as the “last lines of defense.”
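
Just to collect those naming patterns in one place, here’s a toy lookup. The name fragments and groupings are only the rough heuristics described in this post, not a pharmacological reference:

```python
# Rough mapping of antibiotic name fragments to their natural sources,
# per the heuristics above. Illustrative only: real classification goes
# by chemical structure, not by name.
ANTIBIOTIC_ORIGINS = {
    ("cillin",): "Penicillium mold (penicillin class)",
    ("cef", "ceph"): "fungal origin (cephalosporin class)",
    ("cycline", "mycin"): "Streptomyces and related soil bacteria",
}

def guess_origin(drug_name):
    """Guess an antibiotic's likely natural source from its name."""
    name = drug_name.lower()
    for fragments, origin in ANTIBIOTIC_ORIGINS.items():
        if any(name.startswith(f) or name.endswith(f) for f in fragments):
            return origin
    return "unknown from the name alone"

print(guess_origin("amoxicillin"))  # Penicillium mold (penicillin class)
print(guess_origin("cefotaxime"))   # fungal origin (cephalosporin class)
print(guess_origin("doxycycline"))  # Streptomyces and related soil bacteria
```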

What’s important and exciting about the cell membrane studies described in the link is that they hold the promise of helping us develop antibiotics that are not based on natural compounds, which would likely make it more difficult, or at least slower, for pathogenic bacteria to develop resistance to them.

It is interesting to me that over the recorded history of humankind, we have actively and aggressively intervened in the genetics of every kingdom of life to adapt them to our benefit. Every kingdom, that is, except Bacteria. In the case of the bacteria, we have done the opposite – largely adapting them to become less beneficial, and most of that intervention has been in the last century. There are lessons to be taken from this.

Nobel 2012: Chemistry

This year’s Nobel Prize in Chemistry goes to Robert Lefkowitz and Brian Kobilka for their studies of how cells sense and interact with their environment. Everyone has heard of various “receptors.” The term has pretty well entered the common lexicon, in large part because a large fraction of modern drugs interact with this family of receptors, the G-protein coupled receptors.

What’s exciting about this work is not that it is a story of how science evolves and deepens over the years, even though it is: from the late ’60s, when Lefkowitz enters the picture, through the discovery of the G-proteins (which itself earned Gilman and Rodbell the 1994 Nobel Prize in Physiology or Medicine), through Kobilka’s discovery of the gene that encodes a particular receptor of interest, the beta-adrenergic receptor. Any of you taking beta-blockers for hypertension are tinkering with the function of this receptor on a daily basis. This long line of research culminated in Kobilka’s recent publication (Nature 469, 175) of the crystal structure of an activated beta-adrenergic receptor. As the Nobel committee so eloquently put it, this was “a molecular masterpiece – the result of decades of research.”

Rather, what is exciting was Kobilka’s observation that the structure of this beta receptor is substantially similar to that of rhodopsin, the light receptor in the eye. The implication of Kobilka’s work was that the body is able to use similar structures to accomplish widely different tasks, and that our ability to see shares a common evolutionary root with other, wholly internal sensing mechanisms. It has undoubtedly had, and will continue to have, a great deal of impact not only on our search for new pharmaceutical therapies but also on our understanding of the complex biomolecular machinery of life.

As amazing as this work is, I can’t help but feel a bit conflicted about this Nobel Prize. Understand that this has nothing to do with the research or the researchers, but rather with the choices of the Nobel committee. This award feels like an underhanded way of awarding a second Nobel Prize in Physiology. That may be unfair on two counts. First, there is a long track record of biochemists receiving well-deserved chemistry prizes for chemical research. The classic example is Frederick Sanger, who won the 1958 Nobel Prize in Chemistry for the structure of proteins, particularly insulin, and then shared the 1980 prize for his work on nucleic acid sequencing. That work, while on biological substrates, was clearly well ensconced in the fields of chemical research. Second, it is undeniable that certain frontiers of science are in the realm of biological systems, and that research that is timely and novel right now will inevitably have some biological bent. What bothers me is that the motivation for the decades of research was not, to my mind, the chemistry underlying the receptor, but rather the physiology of the receptor’s behavior and its mechanism of action. Further, the capstone piece of work cited by the Nobel committee was the Nature paper mentioned earlier, which dealt with the structure of an activated receptor – a result that calls to mind Watson and Crick’s determination of the structure of DNA. It is worth noting that their work received the 1962 Nobel Prize in Physiology or Medicine.

Many chemists struggle with the “bio-creep” of the prizes, which is probably inevitable and even a little healthy. I will also admit to having given a little friendly joshing to some of my colleagues who have complained about it. But in this particular case, I share the sentiments expressed at ChemBark: this was excellent work deserving of the Nobel Prize in Physiology or Medicine.

(Update: Derek Lowe has written a very thoughtful post on this topic as well.)

Nobel 2012: Physics

This year’s Nobel Prize in Physics was awarded to Serge Haroche and David Wineland for their work on the experimental measurement of quantum systems. The official announcement might make the area of their work seem rather vague, and a cursory glance at the two laureates’ work could lead someone to believe that this award was for work in similar but somewhat unrelated areas. That is most certainly not the case. The work of Haroche and Wineland has enabled the experimental investigation of quantum computing, a field postulated in 1982 by another Nobel physics laureate, Richard Feynman, and one with the potential to change the way we think about computing and the processing of information.

Dr. Haroche’s groundbreaking work, published in 1996, showed experimentally that quantum systems “leak” information to their environment, a process called decoherence that has been heralded as the explanation for wavefunction collapse, and explored the implications of that decoherence for the measurement of the quantum state. To simplify, think back to the classic gedanken experiment of Schrödinger’s cat. In this experiment, the animal’s survival is predicated on the decay of a single atom, which is a quantum event, so we say that the animal must be in a superposition of the two states labelled “alive” and “dead.” Quantum decoherence is the process by which the observed wavefunction collapses into a classical state; in the cat model, it is how the system falls back into the cat being either alive or dead. Haroche’s key paper (Phys. Rev. Lett. 77, 4887) provided key insight into this process of measurement and posited that quantum decoherence, which sets in when the quantum system interacts with its environment, marks the boundary between classical and quantum behavior.
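
For a concrete, if cartoonish, picture of that collapse, here is a toy numerical sketch: a qubit in an equal superposition, written as a density matrix, whose off-diagonal “coherence” terms decay as the system leaks information to its environment. The exponential-decay model and the timescale T2 are illustrative assumptions of my own, not anything taken from Haroche’s paper.

```python
import numpy as np

# Density matrix of a qubit in the superposition (|0> + |1>)/sqrt(2).
# The off-diagonal terms encode the quantum coherence between the states.
rho = np.array([[0.5, 0.5],
                [0.5, 0.5]])

T2 = 1.0  # illustrative coherence time, in arbitrary units

def decohere(rho, t, T2=T2):
    """Damp the off-diagonal terms: a toy model of environmental leakage."""
    out = rho.copy()
    decay = np.exp(-t / T2)
    out[0, 1] *= decay
    out[1, 0] *= decay
    return out

for t in [0.0, 0.5, 2.0, 10.0]:
    print(t, np.round(decohere(rho, t), 4))
# As t grows, rho approaches diag(0.5, 0.5): a classical coin flip between
# "alive" and "dead," with no quantum superposition left.
```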

Dr. Wineland, a winner of both the Rabi Award and the National Medal of Science, used quantum systems of trapped ions to perform computations, pulling together and building on the Nobel-prize-winning work of Wolfgang Paul (ion trapping, 1989) and of Steven Chu, Claude Cohen-Tannoudji, and William Phillips (laser cooling of atoms, 1997), along with Haroche’s work, to show that entangled quantum systems could be made to process information and that the processed information could then be retrieved reliably. While quantum computing is in its relative infancy, Wineland’s early work in the area also enabled the more precise atomic clocks that underpin the global positioning system.

Nobel 2012: Physiology/Medicine

The first announcement of this year’s Nobel season was made this morning: the Nobel Prize in Physiology or Medicine has gone to John Gurdon and Shinya Yamanaka. This is a case where the work being honored is very well known, even outside scientific circles. Induced pluripotent stem cells, made by reprogramming ordinary adult cells, have been discussed in many forums for their potential to enable revolutionary medical treatments.

What’s exciting is that this discovery is key to enabling the future therapeutic successes of stem cells. One such application is one that I have long considered a Holy Grail problem: the growth of complex organs for transplant. While there have been some early successes in this area, particularly in growing relatively simple organs such as bladders (first performed in 2006) and tracheae (first clinical use in 2008), we are still quite far away from organs such as livers, hearts, and kidneys. Both of those tissue engineering successes used a type of stem cell that can be isolated from a patient’s own bone marrow, rather than the induced pluripotent cells that won this year’s Nobel. It is expected that Yamanaka’s work will be key to growing more complex organs.

From the Nobel perspective, this work should have been on anyone’s short list of potential winners. Yamanaka shared last year’s Wolf Prize in Medicine with Rudolf Jaenisch of MIT for this discovery and for Jaenisch’s subsequent use of the technique to treat a genetic disease in a mammal, thus providing a proof of concept for its therapeutic use. What is interesting is that the Nobel committee chose not to include Jaenisch in today’s award.

Induced pluripotent stem cells have a long history. Gurdon’s work replacing the nuclear material of a frog egg cell with nuclear material from a tadpole was originally performed in 1962. While that success would not appear to deal with stem cells, the insight that a mature, differentiated cell could be reverted to an immature state was key to Yamanaka’s later work. What Yamanaka did was to find the specific genes necessary to revert a mature cell to a stem cell, and then to transfect adult cells in order to force the expression of those genes.

Inveterate geeks will recall the scene in Star Trek IV where Dr. McCoy gives a dialysis patient a pill that induces the growth of a new kidney. Induced pluripotent stem cells may bring us closer to the day when those kinds of therapies are available outside the movie theater.

Coming tomorrow: the Physics Nobel. We’ll see if my earlier predictions are accurate.

The 2012 MacArthur Fellows

The MacArthur Foundation has announced the names of the 2012 MacArthur Fellows! This year’s group is a pretty exciting bunch. I was interested to see the number of folks on the list working at the edge of art, science and culture, including Uta Barth and Maurice Lim Miller. The two fellows that interest me the most, mainly based on their area of expertise, are Maria Chudnovsky and Sarkis Mazmanian.

Dr. Chudnovsky is a professor of operations research and mathematics at Columbia University and studies graph theory. I’ve seen a lot of very interesting papers in the past few years where the analytical tractability you gain by casting a problem as a graph has been used to elucidate phenomena from failovers on communications networks to growth dynamics in social media. Dr. Chudnovsky’s work is fundamental in connecting the specifics of graph theory to other branches of analysis.
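
As a tiny illustration of why casting a problem as a graph makes it tractable, here is a sketch, in plain Python on a made-up network, of the failover question mentioned above: does the network stay connected when a node dies?

```python
from collections import deque

# A hypothetical communications network as an adjacency list.
network = {
    "A": {"B", "C"},
    "B": {"A", "C", "D"},
    "C": {"A", "B", "D"},
    "D": {"B", "C", "E"},
    "E": {"D"},
}

def is_connected(graph, removed=frozenset()):
    """Breadth-first search: does the graph stay in one piece
    after the nodes in `removed` fail?"""
    nodes = [n for n in graph if n not in removed]
    if not nodes:
        return True
    seen = {nodes[0]}
    queue = deque(seen)
    while queue:
        for neighbor in graph[queue.popleft()]:
            if neighbor not in removed and neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return len(seen) == len(nodes)

print(is_connected(network))                 # True
print(is_connected(network, removed={"D"}))  # False: E is cut off
print(is_connected(network, removed={"A"}))  # True: B and C cover for A
```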

Dr. Mazmanian is a professor of biology at Caltech. Regular readers of this blog and my Google+ stream will understand why I’m excited about him. His area of study is the interaction between host organisms and their beneficial microbial symbionts. This area of research, and its underlying premise that, at least in humans, we can treat the body as an ecosystem rather than a single organism, promises to shape medical research for the next half-century.

I’m grateful to Ed Darrell for breaking the news when I was sound asleep!

The thrill of setting a stake in the ground

I ran across a great preprint while I was plundering the literature to write a presentation on bio-sourced materials last week. (I had posted on G+ an image from that presentation, of some biologically assembled nanofibers I’d made, which I include below just because of its beauty.) For my talk, what I learned from this preprint got rolled up into an almost throwaway comment about efficiency and yield, but that just doesn’t do justice to the thoughts it provoked. The preprint, entitled The Statistical Physics of Self-Replication, was by Jeremy England of MIT and was a joy to read.

The preprint contains a rough estimate of the lower bound on the heat a bacterium must generate as it replicates. What was thrilling to me is that this is exactly the kind of thought process and calculation that drew me to physics as a field of study. There is something vital and important, not only for the process of science but for sheer satisfaction, about putting a limit on something. This process of putting a stake in the ground and saying “I believe it stops here” is a challenge to the world at large. In that challenge, you are daring the world either to prove you wrong or to carry the result to the next level.

The paper might not be accurate; some of the assumptions it makes may or may not be good ones. To some extent, that matters less than the fact that someone made the calculation at all. The paper is also a reminder that when I do the same sort of calculations for pleasure, I ought to put more work into them: to see what’s been done before, and to see whether I have anything unique to add.