Remembering Dr. Jim Bassingthwaighte

Leave a message for Dr. Bassingthwaighte's family and colleagues below (you will need to log in to the wiki first):

  • Grace Peng (09/08/22)
    • Jim made an indelible impact on the IMAG Multiscale Modeling Consortium. As a founding member, he instilled a quiet yet urgent appeal for reproducible, shareable computer models, while providing consistent encouragement, inspiration, and hope for me and many future generations of multiscale modelers. IMAG is proud to host the Physiome Project on this wiki.
  • Herbert Sauro (09/09/2022)
    • I met Jim 15 years ago when I was invited to give a talk at a mini-workshop on systems biology at the liberal arts college in Beloit, Wisconsin. It turned out Jim was also at the workshop, and so we got to know each other. That started a 15-year friendship, and to this day I think about how much he helped me over those years. I wouldn't be where I am now if it wasn't for Jim. That was his nature: he would help people whenever he could. There was always a trail of people coming to his office at work to ask him for advice, talk about science, or just idle chat. In my world Jim was one of those rare scientists, someone born to be a scientist, and right up to his final days he still read papers and took conference calls. His enthusiasm and curiosity about the natural world were undiminished to the very end. In the last 2 or 3 years, for example, he developed a unique correction to the Michaelis-Menten kinetic equations (technically the Briggs-Haldane formulation) to account for delays in substrate binding, something that many of us ignore at our peril when constructing metabolic models.
  • Rob MacLeod (9/10/2022)
    • I first met Jim virtually 20+ years ago when he and I started a lively debate about whether journal reviewers (he was the Annals of BME Editor) should have anonymity. It was a typical Jim conversation: open, frank, thoughtful, and fun. In person, during many subsequent conversations and meetings, Jim was just as open, frank, thoughtful, and fun! We will miss him.
  • Ahmet Erdemir (9/12/2022)
    • I met Jim more than a decade ago when I first participated in the Multiscale Modeling Consortium. As a wet-behind-the-ears investigator, I was starstruck and probably didn't know what to say to this giant figure of the modeling and simulation community. Jim was always so generous with his time, ideas, and friendship, which resulted in many deep discussions through the years that shaped my modeling philosophy and collegiality. I am certain that Jim left a long-lasting mark on many young scientists who happened to cross paths with him. He will be missed dearly, and we will continue to learn from his legacy.
  • Jeff Saucerman (9/12/2022)
    • Jim was a great encourager and promoter of emerging scientists, not just of me but of many others in this community. His optimistic vision was inspiring.
  • Chris Johnson (9/12/2022)
    • I knew Jim for 25+ (maybe 30+) years.  I went back and looked at some of my old emails from Jim.  I found one that I thought you/others might find interesting.  In response to a PITAC Report, NSF created the ITR program.  I was part of a team with Jim putting together a proposal.

      Below are Jim’s enthusiastic thoughts on biological pattern modeling and computing.  Jim was one of the most enthusiastic scientists I have ever met.  He seemed to have boundless energy and used it for the good of science.

      -------------------------------------------------------------------------------------------------------

      Begin forwarded message:

      From: "James B. Bassingthwaighte 5-2012" <jbb@bioeng.washington.edu>

      Subject: Pattern computing and pattern modeling

      Date: December 4, 2000 at 8:09:34 PM MST

      This is probably all old stuff to you, but I spent yesterday [dreaming] of biological patterns and computing on them.

      I like Bramley's admonition to be general: this [isn't] just for biology, [but of] any complex large system. So here are my ramblings to try to tell about the idea to tackle two great challenges: understanding complex systems, and computing complex systems in [reasonable] time so that the computation is a thinking tool.

      Jim

      James B. Bassingthwaighte, MD, PhD    email: jbb@bioeng.washington.edu

      Prof. Bioengineering, Radiology       ph: 206-685-2005 (sec) or -685-2012 (direct)

      University of Washington, Box 35-7962  FAX: 206-685-2651

      Seattle, WA 98195-7962       website: http://nsr.bioeng.washington.edu

      1.  Computational and Biological Challenges of the ITR

      The challenges are clear, both for the biological modeling and for the integrative computation: One hundred thousand genes give rise to most of a million proteins in the human. These do NOT define life, which is much more demanding in terms of biology and informatics: spatial and temporal localizations, environmental conditions, history of developmental stresses, all give rise to phenotypes that are not directly predictable from the genome. For understanding biology, we need to use computation as a mind expander; to compute fast enough that computation is useful as a thinking tool, we need new computational strategies to describe the behavior of physiological systems.

      One cannot imagine computers ever being fast enough to compute the behavior of the proteins in an adult cell, let alone an organism: calculating on first principles the kinetics of 1 million factorial interactions is an overestimate, but even factorial 100,000 would take, on the fastest imaginable computer, more time than has existed since the Big Bang. Predictive computing of this sort is therefore impossible.
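The scale argument here is easy to verify. A back-of-the-envelope sketch in Python (the exaflop machine rate is an assumed figure for illustration, not from the email) shows that 100,000! has roughly 456,000 decimal digits, so even counting one interaction per operation dwarfs the age of the universe:

```python
import math

def log10_factorial(n):
    # ln(n!) = lgamma(n + 1); divide by ln(10) to get base-10 magnitude
    return math.lgamma(n + 1) / math.log(10)

magnitude = log10_factorial(100_000)      # 100000! ~ 10**456573
seconds_since_big_bang = 4.4e17           # ~13.8 billion years, in seconds
exaflop_rate = 1e18                       # assumed: 10**18 operations/second

# operations a universe-age exaflop run could perform: ~10**35.6
feasible_magnitude = math.log10(exaflop_rate * seconds_since_big_bang)
print(round(magnitude), round(feasible_magnitude))
```

The gap — roughly 10^456573 required operations against about 10^36 feasible ones — is why the email concludes that first-principles predictive computing at this scale is impossible.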

      This creates a moral dilemma.  As one gives new drugs, or intervenes at the genetic level, it is imperative to make reasonably accurate predictions.  This is a major MACROethical issue.  One must plan to do the least harm, and yet one must make forward progress, because not to move ahead is also immoral and unethical since one knows that we CAN do better in providing health and maintaining the environment. This is a MACROethical issue since it goes beyond what an individual can do or not do, and becomes a societal issue: medical science MUST be advanced, but in the most thoughtful and least harmful way possible.  Thinking hard in this context means using computers to portray the patterns of biological responses to intervention.

      The two grand challenges must be faced together: new computational methods to describe the behavior of very large systems, and new ways of looking at biological systems. The reductionist route, whether one starts from genome up or from organism down to molecules, is impossible. That dynamical systems are inherently unpredictable is not the issue: it is simply that there is too much to compute. One can't build a truck out of quarks.

      The solution: Both computation and biological modeling need to be reduced to the computing and visualization of patterns. During the two past millennia we have worked to gain precision: mathematics went from geometry to algebra and calculus, then to differential equations and ever more massive calculations. Numerical accuracy has been the issue most often raised against the hope of learning biology through large scale systems analysis. Likewise biology has been revealed remarkably successfully through reductionist approaches: physiology, then biochemistry, cell biology, molecular biology and genetics.

      We understand successful modeling as we understand Newtonian physics of planetary motions: the model is so good that nature seems no better. As James Bailey (After Thought, 1996) puts it, "The vocabulary of the mathematical fiction inexorably becomes the vocabulary with which we describe the reality." This is inevitably revealed when we accept the nerve action potential as the Hodgkin-Huxley action potential, and use it as a building block.

      The new Physiome paradigm, if we pretend to legitimize it by giving it a name that implies an integrative approach, is to recognize biological, ecological, atmospheric and other systems behave in an assemblage of patterns. If a vasoconstrictor is given, there is a pattern of responses, cardiac, neural, pulmonary, renal, and systemic: this is well "understood" at the level of classical organ system physiology, where "understood" means well described and reportable in articles and textbooks, but does not mean explained causally through all the chemical and biophysical levels. Likewise, if the heart is paced from an abnormal site, so that the spread of excitation is different from what it is when the spread of excitation is from the normal sinus node origin, there is an emergent pattern of hypertrophy in some regions, atrophy in others, and the heart remodels itself. As a part of this pattern, there is changed energy utilization locally, changed metabolisms and even the choice of substrate, and of course changed gene expression and translation of mRNA to proteins for metabolism, transport and for contraction. The pathways that regulate these patterns are for the most part not yet known, but we now know how to approach discovering the details of the patterns, and from those will emerge knowledge of how the patterns are evoked and regulated. Experimentally, this will be done using 100x100 (or larger) cDNA microarrays on samples taken over a sequence of times after starting the pacing and determining the patterns of change in expression to determine which proteins are in the pathways governing the changes. This is a standard approach which will yield to standard bioinformatics methods.

      To compute and predict on large scale biological systems requires new tools. One new tool is a modification of the standard function generator wherein an input vector x linearly defines an output vector y. Extend this to use the pattern descriptors as building blocks: a set of patterns on one group of systems leads to patterns of response in other systems. The patterns are summarized as "function generators", each of which is defined by a set of control variables, and whose output in time and space is a pattern predicted accurately from prior comprehensive detailed modeling. But within the function generator is no longer the biophysical, biochemical, or other set of equations, but only the captured behavioral relationships. This is not necessarily crude, and it is effective, greatly reducing computational time and thereby allowing prediction at a rate impossible if the original "first principle" equations were used. Thus in order to operate and predict at one hierarchical level, the nice complexities at the more refined levels become not just invisible, but actually vanish, waiting to be recalled by a curious observer who wishes to understand at a deeper level. Such computational simplification allows not just prediction, but should be so efficient that the computer is useful as an on-line thinking aid.
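The "function generator" idea above can be sketched as a precomputed surrogate: run the detailed model offline over its control variables, capture the behavioral relationship in a table, and answer online queries by cheap interpolation instead of re-solving the underlying equations. A minimal illustration, with an invented stand-in for the detailed model (nothing here is from an actual Physiome code base):

```python
import math
from bisect import bisect_left

def detailed_model(u):
    # stand-in for an expensive first-principles simulation (illustrative only)
    return math.tanh(3.0 * u) + 0.1 * u

# offline: tabulate the detailed model over its control variable
grid = [i / 10 - 2.0 for i in range(41)]          # -2.0 .. 2.0, step 0.1
table = [detailed_model(u) for u in grid]

def function_generator(u):
    # online: linear interpolation over the captured pattern replaces
    # re-solving the biophysical equations at every query
    i = min(max(bisect_left(grid, u), 1), len(grid) - 1)
    u0, u1 = grid[i - 1], grid[i]
    frac = (u - u0) / (u1 - u0)
    return table[i - 1] + frac * (table[i] - table[i - 1])

# the surrogate reproduces the captured behavior at a fraction of the cost
print(abs(function_generator(0.55) - detailed_model(0.55)) < 1e-2)
```

The error-correction mechanisms described in the next paragraph correspond, in this sketch, to detecting when a query leaves the region the table was built for and falling back to the detailed model to recompute.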

      (Such simplifications, making very comprehensive models into pattern generators, will fail when there are important interconnections between functional units that were unrecognized in the more basic formulations, so error correction mechanisms need to be built in; one would be to go back to the biophysical model to recompute a particular set of functions when conditions have changed; another would be to redesign the underlying biophysical model or impose a different set of thermodynamic constraints.)

      The relationships between different nodes or pattern generating components will be of a good many types: transforms into, controls and limits, switches to, combines with, and so on. As with a biochemical equation A+B <-> C+D, there are several possible meanings, so relationships will require explicit subdefinitions. But they can be made computable. Pattern computing is something I have no idea how to do; from my point of view that is a grand challenge. It is presumably an evolutionary set of steps beyond neural nets, genetic algorithms and simulated annealing, and might well retain aspects of uncertainty, which would only be realistic. Heisenberg wins every battle over mathematical biophysics at the lowest levels, but integration wins out when there are enough players. Single channel data won't appear in whole-cell models of electrophysiology.

      Rob, Chris: This is too long again. I will try to boil it down, but I hope to have written enough that you can critique the ideas.

      Summary: A two pronged thrust is required in order to compute predictions of complex large scale system behavior: pattern computing and observations of patterns in behavior. While our application area is in the biology of the heart, the strategies should hold for ecosystems, environmental systems, and for biology in general.
