a bonobo humanity?

‘Rise above yourself and grasp the world’ Archimedes – attribution

Archive for the ‘biology’ Category

reveries of a solitary wa*ker: wa*k 1


(Being a thousand words or so of mental drivel)

I’d prefer not to be coy about the title but I’ve a job to protect.

the delightful enthusiasm of children


Began watching the documentary series Chronicles of the Third Reich, yet another rake-over of that terrible but ghoulishly fascinating period, and it kicked off with noted historian Ian Kershaw saying that the regime was unique in that it aimed to overthrow the entire Judeo-Christian system of ethics that had sustained western Europe for centuries. Bullshit I say. No such thing. What nazism was overthrowing, or delaying or subverting, was the progress of western Europe – the Renaissance and the Enlightenment, the movements towards democracy, individual liberty and internationalism, none of which owed anything to the Judeo-Christian belief system. This lazy thinking continually goes unchallenged. At the height of Judeo-Christian control we had monarchical dictatorships, divine right, religious authoritarianism, extreme corruption, torture, rigid hierarchies, feudal slavery, etc – a world of inhumanity and brutality. Not that Christianity caused all this; life would doubtless have been no better in China or Japan. How well you fared depended on chance and ‘birthright’.


Reading the big bio of Darwin by Desmond and Moore, and thinking how so much that was once radical or extreme becomes mainstream within a few generations – materialism, atheism, democratic principles, equality for women, humans as apes. Chartism’s aims – extension of suffrage, taxation reform, the repeal of laws too unjust to be enacted nowadays – were all horrific to the upper classes, who armed themselves with crowbars to protect their homes and privileges. And among the Chartists, quite a few favoured transmutation (though not of the Darwinian kind – more a sort of Lamarckian progressive development towards the human pinnacle) and atheistic science. Makes you think of today’s accelerating trends, e.g. gay marriage. All these ideas were opposed because they would bring down civilisation as we know it. Rock n roll was another one.
Also thinking how science threatened and continues to threaten religion. A Moslem student asked me last week, do you think humans come from apes? I could see what his hopes were, and was happy to crush them and move on. No doubt he’ll return to Saudi, ask the question again and be reassured as to his human specialness. But maybe not. Back in Darwin’s day, so many associates – Sedgwick, Henslow, Lyell, Owen, Whewell, even Herschel, even bloody Wallace – couldn’t countenance our ‘demotion’ to a primate, on grounds some of them didn’t even recognise as religious. How can it possibly be argued that religion and science are compatible? Only if we have a very different religion, and perhaps a very different science – panpsychism, spooky action at a distance, positively conscious positrons.

A love-hate thing with Darwin, all his stuffy aristocratic connectedness, his family’s money, but then his boldness of ideas, but then his timidity born of an unwillingness to offend, a need to be admired, feted, but two kinds of glory, the one for a grand idea that might just outlast the opprobrium of his elite class in mid-nineteenth century England, the other for being a model member of that class, civilized, restrained, highly intelligent, pushing gently outwards the boundaries of knowledge. The tension between immediate, hail-fellow-well-met acceptance and something more, his dangerous idea, something barely digestible but profoundly transformative.


Keep reading about the hard problem of consciousness, without greatly focusing. Don’t really believe in it. We’re surely just at the beginning of getting to grips with this stuff – but how much time do we have? Dennett talks of the mind as cultural construct, Cartesian theatre as he calls it, and you don’t need to have ever heard of Descartes to wonder at how memories, rehearsals, fantasies can be played out inside the head, inaccessible to everyone but yourself, but without the boundaries of the skull, or of a theatre, no straightforward boundaries of space or time, yet composed of reality-bits, physical and emotional. One of my first serious wonderings, I seem to remember (not trustworthy) was about this boundary-less but secret place-thing called the mind. Not sure about a cultural construct, seemed very real and self-evident to me, and a wonderful safe haven where you can think and do things for which you’ll never get arrested, never have to apologise, a theatre of blood, sex and brilliance…

But I don’t think I thought then, and I don’t think now, that this was anything other than a product of the brain because to me the brain was like every other organ, the heart, the liver, the kidneys, the lungs, they were all mysterious, I didn’t know how any of them worked, and though I knew that I could learn a lot more about them, and would over the course of my life, I suspected that nobody knew everything about how any of them functioned, and the brain was just more complex and so would contain more mysteries than any of the others perhaps put together, but it had to come from the brain because, well everybody said thoughts were produced by the brain and these were just thoughts after all and where else could they come from – there was no alternative. And it seems we’re slowly nutting it out, but humans are understandably impatient to find answers, solutions. We like to give prizes for them.


Also reading Natalie Angier’s Woman, a revised version of a book brought out in the nineties. It’s a popular biology book from a good feminist perspective, and I’m learning much about breast milk and infant formula, about the breast itself, about menstruation, about the controversies around hysterectomies and so on, but her style often irritates, drawing attention to itself with clever-clever writing rather than to the subject at hand. It’s a tricky area: you want your writing lively and engaging, not like reading an encyclopedia, but especially with science writing you want it all to be comprehensible and transparent – like an encyclopedia. Angier sometimes uses metaphors and puns and (for me) arcane pop references which have me scratching my head and losing the plot, but to be fair it’s worth persevering for the content. But it shouldn’t be about persevering.

on vaccines and type 1 diabetes, part 3 – causes


[graph not reproduced]

As mentioned earlier, it’s not precisely known what causes diabetes type 1, more commonly known as childhood diabetes. There’s a genetic component, but it’s clearly environmental factors that are leading to the recent apparently rapid rise in this type.

I use the word ‘apparently’ because it’s actually hard to put figures on this rise, due to a paucity of historical records. This very thorough and informative article, already 12 years old, from the ADA (American Diabetes Association – an excellent website for everything to do with the evidence and the science on diabetes), tries to gather together the patchy worldwide data to cover the changing demography and the evolving disease process. At the beginning of the 20th century childhood diabetes was rare but commonly fatal (this was before insulin), and even by mid-century no clear rise in childhood incidence had been recorded. To quote the article, ‘even by 1980 only a handful of studies were available, the “hot spots” in Finland and Sardinia were unrecognized, and no adequate estimates were available for 90% of the world’s population’. Blood glucose testing in the early 20th century was far from being as simple a matter as it is today, and the extent of undiagnosed cases is hard to determine.

There’s no doubt, however, that in those countries keeping reliable data, such as Norway and Denmark, a marked upturn in incidence occurred from the mid 20th century, followed by a levelling out from the 1980s. Studies from Sardinia and the Netherlands have found a similar pattern, but in Finland the increase from mid-century has been quite linear, with no levelling out. Data from other northern European countries and the USA, though less comprehensive, show a similar mid-century upturn. Canada now (or as of 12 years ago) has the third highest rate of childhood diabetes in the world. The trend seems to have been that many of the more developed countries first showed a sharp increase, followed by something of a slow-down, and then other countries, such as those of central and eastern Europe and the Middle East, ‘played catch-up’. Kuwait, for example, had reached seventh in the world at the time of the article, confounding many beliefs about the extent of the disease’s genetic component.

The article is admirably careful not to rush to conclusions about causes. It may be that a number of environmental factors have converged to bring about the rise in incidence. For example, it’s known that rapid growth in early childhood increases the risk, and children do in fact grow faster on average than they did a century ago. Obesity may also be a factor. Baffled researchers naturally look for something new that has entered the childhood environment, either in terms of nutrition (e.g. increased exposure to cow’s milk) or infection (enteroviruses). Neither of these possibilities fits the pattern of incidence in any obvious way; there may be subtle changes in antigenicity, or in exposure at different stages of development, but there’s scant evidence of these.

Another line of inquiry is the possible loss of protective factors, as part of the somewhat vague but popular ‘hygiene hypothesis’, which argues that lack of early immune system stimulation creates greater susceptibility, particularly to allergies and asthma, but perhaps also to childhood diabetes and other conditions. The ADA article has this comment:

Epidemiological evidence for the hygiene hypothesis is inconsistent for childhood type 1 diabetes, but it is notorious that the NOD mouse is less likely to develop diabetes in the presence of pinworms and other infections. Pinworm infestation was common in the childhood populations of Europe and North America around the mid-century, and this potentially protective exposure has largely been lost since that time.

The NOD (non-obese diabetic) strain of mice was developed in Japan as an animal model for type 1 diabetes.

The bottom line from all this is that more research and monitoring of the disease needs to be done. Type 1 diabetes is a complex challenge to our understanding of the human immune system, and of the infinitely varied feedback loops between genetics and environment, requiring perhaps a broader questioning and analysis than has been applied thus far. Again I’ll quote, finally, from the ADA article:

In conclusion, the quest to understand type 1 diabetes has largely been driven by the mechanistic approach, which has striven to characterize the disease in terms of defining molecular abnormalities. This goal has proved elusive. Given the complexity and diversity of biological systems, it seems increasingly likely that the mechanistic approach will need to be supplemented by a more ecological concept of balanced competition between complex biological processes, a dynamic interaction with more than one possible outcome. The traditional antithesis between genes and environment assumed that genes were hardwired into the phenotype, whereas growth and early adaptation to the environment are now viewed as an interactive process in which early experience of the outside world is fed back to determine lasting patterns of gene expression. The biological signature of each individual thus derives from a dynamic process of adaptation, a process with a history.

However, none of this appears to provide any backing for those who claim that a vaccine is responsible for the increased prevalence of the condition. So let’s wade into this specific claim.

It seems the principal claim of the anti-vaxxers is that vaccines suppress our natural immune system. This is the basic claim, for example, of Dr Joseph Mercola, a prominent and heavily self-advertising anti-vaxxer whose various sites happen to come up first when you combine and google key terms such as ‘vaccination’ and ‘natural immunity’. Mercola’s railings against vaccination, microwaves, sunscreens and HIV (it’s harmless) have garnered him quite a following among the non compos mentis, but you should be chary of leaping in horror from his grasp into the waiting arms of the next site on the list, that of the Vaccination Awareness Network (VAN), another Yank site chock-full of BS about the uselessness of, and the harm caused by, every vaccine ever developed, some of it impressively technical-sounding, but accompanied by ‘research links’ that either go nowhere or lead to tabloid news reports. Watch out too for the National Vaccination Information Centre (NVIC), another anti-vax front, full of heart-rending anecdotes which omit everything required to make an informed assessment. The best may seem to lack conviction, being skeptics and all, but it’s surely true that the worst are full of passionate intensity.

There is no evidence that the small volumes of targeted antigens introduced into our bodies by vaccines have any negative impact on our highly complex immune system. This would be well-nigh impossible to test for, and the best we might do is look for a correlation between vaccination and increased (or decreased) levels of disease incidence. No such correlation has been found between the MMR vaccine and diabetes, though this Italian paper did find a statistically significant association between the incidence of mumps and rubella viral infections and the onset of type 1 diabetes. Another paper from Finland found that the incidence of type 1 diabetes levelled out after the introduction of the MMR vaccine there, and that the presence of mumps antibodies was reduced in diabetic children after vaccination. This is a mixed result, but as yet there haven’t been any follow-up studies.

To conclude, there is just no substantive evidence of any kind to justify all the hyperventilating.

But to return to the conversation with colleagues that set off this bit of exploration, it concluded rather blandly with the claim that, ‘yes of course vaccinations have done more good than harm, but maybe the MMR vaccine isn’t so necessary’. One colleague took a ‘neutral’ stance. ‘I know kids that haven’t been vaccinated, and they’ve come to no harm, and I know kids that have, and they’ve come to no harm either. And measles and mumps, they’re everyday diseases, and relatively harmless, it’s probably not such a bad thing to contract them…’

But this is a false neutrality. Firstly, when large numbers of parents choose not to immunise their kids, it puts other kids at risk, as the graph at the top shows. And secondly, these are not harmless diseases. Take measles. While writing this, I had a memory of someone I worked with over twenty years ago. He had great thick lenses in his glasses. I wear glasses too, and we talked about our eye defects. ‘I had pretty well perfect vision as a kid,’ he told me, ‘and I always sat at the back of the class. Then I got measles and was off school for a fortnight. When I went back, sat at the back, couldn’t see a thing. Got my eyes tested and found out they were shot to buggery.’

Anecdotal evidence! Well, it’s well known that blindness and serious eye defects are a major complication of measles, which remains a killer disease in many of the poorest countries in the world. In fact, measles blindness is the single leading cause of blindness in those countries, with an estimated 15,000 to 60,000 cases a year. So pat yourself on the back for living in a rich country.

In 2013, some 145,700 people died from measles – mostly young children. In 1980, before vaccination was widely implemented, an estimated 2.6 million died annually from measles, according to the WHO.

Faced with such knowledge, claims to ‘neutrality’ are hardly forgivable.

Written by stewart henderson

January 30, 2015 at 6:02 pm

what is autism and what causes it?



The term ‘autism’ was taken up in the 1940s by two physicians working independently of each other, Hans Asperger in Austria and Leo Kanner in the USA, to describe a syndrome whose key feature was a problem with interacting with others in ‘normal’ ways. Sounds vague, but the problem was anything but wishy-washy to these individuals’ parents and families, and over time a more detailed profile has been built up.

The term itself is from the Greek autos, or ‘self’, because those with the syndrome had clear difficulties in interpreting others’ moods and responses, resulting in a withdrawn, often antisocial state. Autistic kids often avoid eye contact and are all at sea over the simplest communication.

Already though, I feel I’m saying too much. When describing autism, it’s common to use words like ‘often’ or ‘sometimes’ or ‘some’, because the symptoms are seemingly so disparate. Much of what follows relies on the neurologist V S Ramachandran’s book The tell-tale brain, especially chapter 5, ‘Where is Steven? The riddle of autism’.

Autistic symptoms can be categorised in two major groups, social-cognitive and sensorimotor. The social-cognitive symptoms include mental aloneness and a lack of contact with the world of other humans, an inability to engage in conversation and a lack of emotional empathy. Also a lack of any overt ‘playfulness’ or sense of make-believe in childhood. These symptoms can be ‘countered’ by heightened, sometimes obsessive interest in the inanimate world – e.g. the memorising of ostensibly useless data, such as lists of phone numbers.

On the sensorimotor side, symptoms include over-sensitivity and intolerance to noise, a fear of change or novelty, and an intense devotion to routine. There’s also a physical repetitiveness of actions and performances, and regular rocking motions.

These two types of symptoms raise an obvious question – how are the two types connected to each other? We’ll return to that.

Another motor symptom, which Ramachandran thinks is key, is a difficulty in physically imitating the actions of others. This has led him to pursue the hypothesis that autism is essentially the result of a deficiency in the mirror neuron system.

In recent years there’s been a lot of excitement about mirror neurons – possibly too much, according to some neurologists. A mirror neuron is one that fires not only when we perform an action but also when we observe it being performed by others. They’ve been found in mammals and also, it seems, in birds; in humans they’ve been located in the premotor cortex, the supplementary motor area, the primary somatosensory cortex and the inferior parietal cortex. It’s easier, however, to locate them than to determine their function. Clearly, to describe them as ‘responsible’ for empathy, or intention, is to go too far. As Patricia Churchland points out, ‘a neuron is just a neuron’, and what we describe as empathy or intention will likely involve a plethora of higher-order processes and connections, in which mirror neurons play their part.

With that caveat in mind, let’s continue with Ramachandran’s speculations on autism and mirror neurons. First, we’ll need to be reminded of the term ‘theory of mind’, used regularly in psychology. It’s basically the idea that we attribute to others the same sorts of intentions and desires that we have because of the assumption that they, like us, have that internal feeling and processing and regulating system we call a ‘mind’. A sophisticated theory of mind is one of the most distinctive features of the human species, one which gives us a unique kind of social intelligence. That autism would be related to theory-of-mind deficiencies seems a reasonable assumption, so what is the brain circuitry behind theory of mind, and how do mirror neurons fit into this picture?

Although neuro-imaging has revealed that autistic children have larger brains with larger ventricles (brain cavities) and notably different activity within the cerebellum, this hasn’t helped researchers much, because autism sufferers don’t present any of the usual symptoms of cerebellum damage. It could be that these changes are simply the side effects of genes which produce autism. Some researchers felt it was better to focus on mirror neurons straight-off, as obvious suspects, and to see how they fired and where they connected in particular situations. They used EEG (electroencephalography) as a non-invasive way to observe mirror neuron activity. They focused on the suppression of mu waves, a type of brain wave. It has long been known that mu waves are suppressed when a person makes any volitional movement, and more recently it has been discovered that the same suppression occurs when we watch others performing such movements.

So researchers used EEG (involving electrodes placed on the scalp) to monitor neuronal activity in a medium-functioning autistic child, Justin. Justin exhibited a suppressed mu wave, as expected, when asked to make voluntary movements. However, he didn’t show the same suppression when watching others perform those movements, as ‘neurotypical’ children do. It seemed that his motor-command system was functioning more or less normally, but his mirror-neuron system was deficient. This finding has been replicated many times, using a variety of techniques, including MEG (magnetoencephalography), fMRI and TMS (transcranial magnetic stimulation). Reading about all these techniques would be a mind-altering experience in itself.
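The mu-suppression measure boils down to comparing power in the mu band (roughly 8–13 Hz) between a baseline recording and an active condition. Purely as an illustrative sketch – with synthetic signals and invented numbers, nothing like the care taken in a real EEG pipeline – it might look like this:

```python
import numpy as np

def band_power(signal, fs, lo=8.0, hi=13.0):
    """Total spectral power in the mu band (8-13 Hz), via a simple periodogram."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    return psd[(freqs >= lo) & (freqs <= hi)].sum()

def mu_suppression_index(active, baseline, fs):
    """Log power ratio: values below zero indicate mu suppression."""
    return float(np.log(band_power(active, fs) / band_power(baseline, fs)))

# Synthetic demo: a 10 Hz 'mu rhythm' at rest, attenuated during movement
fs = 256
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(42)
rest = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
moving = 0.4 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)

print(mu_suppression_index(moving, rest, fs))  # negative, i.e. suppressed
```

A negative index means less mu power during the action than at baseline – the suppression the researchers were looking for when the subject moved, and the thing that failed to appear in Justin when he merely watched others move.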

According to Ramachandran, all these confirmations ‘provide conclusive evidence that the [mirror neuron] hypothesis is correct’. It certainly helps to explain why a subset of autistic children has trouble with metaphor, tending to take it literally. They have difficulty separating the physical and the referential, a separation that mirror neurons appear to mediate somehow.

A well-developed theory of mind which can anticipate the behaviour of others is clearly a feature of understanding our own minds better. In Ramachandran’s words:

If the mirror-neuron system underlies theory of mind and if theory of mind in normal humans is supercharged by being applied inward, towards the self, this would explain why autistic individuals find social interaction and strong self-identification so difficult, and why so many autistic children have a hard time correctly using the pronouns ‘I’ and ‘you’ in conversation. They may lack a mature-enough self-representation to understand the distinction.

Of course, tons more can be said about the ‘mirror network’ and tons more research remains to be done, but there are many promising signs. For example, the findings about lack of mu wave suppression could be used as a diagnostic tool for the early detection of autism, and some interesting work is being done on the use of biofeedback to treat the disorder. Biofeedback is a process whereby physiological signals picked up by a machine from the brain or body of a subject are represented to the subject in such a way that he or she might be able to affect or manipulate that signal by a conscious change of behaviour or thinking. Experiments have been done to show that subjects can alter their own brain waves through this process. Some experimental work is also being done with drugs such as MDMA (otherwise known as the party drug ‘ecstasy’) which appear to enhance empathy through their action on neurotransmitter release.
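The biofeedback process described above is, at heart, a loop: measure a signal, display it to the subject, let the subject respond, repeat. It can be caricatured in a few lines. This is a toy sketch with an invented ‘subject model’ and made-up numbers, not any clinical protocol:

```python
import random

def neurofeedback_session(level, respond, steps=50, target=0.5):
    """Toy biofeedback loop: each step the machine 'displays' the error
    between the measured signal and a target, and the subject responds."""
    trace = [level]
    for _ in range(steps):
        feedback = level - target          # what the display shows
        level = respond(level, feedback)   # subject nudges their own signal
        trace.append(level)
    return trace

# Hypothetical subject: damps a fraction of the displayed error, noisily
random.seed(1)
respond = lambda level, fb: level - 0.2 * fb + random.gauss(0.0, 0.01)
trace = neurofeedback_session(1.0, respond)
print(abs(trace[-1] - 0.5) < abs(trace[0] - 0.5))  # signal moved toward target
```

The point of the sketch is only the shape of the loop: the subject never sees the raw physiology, just a representation of it, and learns by trial and error to push that representation toward a target.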

So that’s a very brief introduction to autism. Hopefully I’ll come back to it in the future to explore the progress being made in understanding and treating the syndrome.

Written by stewart henderson

October 23, 2013 at 10:25 am

what does curiosity actually mean?


Robert Hooke, star of the early Royal Society


You might say that Philip Ball has performed a curious task with his book, Curiosity. He’s taken this term, which we moderns might take for granted, and examined what intellectuals and the public have made of it down through the ages – with a particular focus on that wobbly symbol of the seventeenth century British scientific enlightenment, the Royal Society. I’ve been spending a bit of time in the seventeenth century lately, what with Dava Sobel’s book on the struggle to measure longitude, Matthew Cobb’s book on the untangling of the problem of eggs and sperm and conception, not to mention Bill Bryson’s lively treatment of Hooke, Leeuwenhoek and cells and protozoa in A Short History of Nearly Everything.

That century, with some of its most interesting actors, including Francis Bacon, René Descartes, William Harvey, Jan Swammerdam, Nicolas Steno, Johann Komensky (aka Comenius), Samuel Butler, Thomas Hobbes, Robert Hooke, Robert Boyle, Antonie van Leeuwenhoek, Thomas Shadwell, Margaret Cavendish and Isaac Newton, represented a great testing period for science and its reception by the public. Curiosity has always had its enemies, and still does, as evidenced by some Papal pronouncements of recent years, but in earlier, more universally religious times, knowledge and its pursuit were treated with great wariness and suspicion, a suspicion sanctioned by the Biblical tale of the fall. The Catholic Church had risen to a position of great power in the west, though the revolting Lutherans, Anglicans, Calvinists and their ilk had spoiled the party somewhat, and England in particular, having grown in pride and prosperity during the Elizabethan period, was flexing its muscles and exercising its grey matter in exciting new ways. The sense of renovation was captured by the versatile Bacon, with works like the Novum Organum (New Instrument), The New Atlantis and The Advancement of Learning.

In the past I’ve described curiosity and scepticism as the twin pillars of the scientific mindset, but they’re really more like a pair of essential forces that interact and modify each other. Scepticism without curiosity is just pure negativity and nihilism, curiosity without scepticism is directionless and naive.

But perhaps that’s overly glib. What, if any, are the limits of curiosity, and when is it a bad thing? It killed the cat, after all.

The word derives from the Latin ‘cura’, meaning care. Think of the word ‘curator’. However, if you think of one of the most curious works of the ancients, Pliny the Elder’s Natural History, you’d have to say, from a modern perspective, that little care was taken to separate truth from fiction in his massive and sometimes bizarre collection of curios. This sort of unfiltered inclusivity in collecting ‘facts’ and stories goes back at least to Herodotus, the ‘father of lies’ as well as of history, and it goes forward to medieval bestiaries and herbaria. These collections of the weird and wonderful were, of course, not intended to be scientific in the modern sense. The term ‘science’ wasn’t in currency and no clear scientific methodologies had been elaborated. As to curiosity, it certainly wasn’t a fixed term, and after the political establishment of Christianity, it was more often than not seen in a negative light. ‘We want no curious disputation after possessing [i.e. accepting the truth of] Jesus Christ’, wrote Tertullian in the early Christian days. Another early Christian, Lactantius [c240-c320], explained that the reason Adam and Eve were created last was so that they’d remain forever ignorant of how their god created everything else. That was how it was intended to be. Modern creationists follow this tradition – God did it, we don’t know how and we don’t really care.

Fast forward to Francis Bacon, who still, in the early 17th century, had to contend with the view of curiosity as a sinful extravagance, a view that had dominated Europe for almost a millennium and a half. Bacon had quite a pragmatic, almost business-like view of curiosity as a tool to benefit humanity. The ‘cabinet of curiosities’ was becoming well established in his time, and Bacon advised all monarchs, indeed all rich and powerful men, to maintain one, well sorted and labelled, as if to do so would be magically empowering. The problem with these cabinets, though, was that there was little understanding about the relations between entities and articles. That’s to say, there was little that was modernly scientific about them. Their objects were largely unrelated rarities and oddities, having only one thing in common, that they were ‘curious’. Bacon recognised that this wouldn’t quite do, and tried to point a way forward. He didn’t entirely succeed, but – small steps.

Ball’s book is at pains to correct, or at least provide nuance to, the standard view of Bacon as initiator of and father-figure to the British scientific enlightenment. In fact, Bacon may have been a Rosicrucian, and his utopian New Atlantis describes a more or less priestly caste of technical experts, living and working in Solomon’s House, and keeping their arts and knowledge largely under wraps, like the alchemists and mages of earlier generations. Bacon, with his government connections and his obvious ambition to be benefited by as well as benefiting the state, was concerned to harness knowledge to productivity and profit, and those who see science largely as a coercion of nature have cursed him for it ever since. Mining and metallurgy, engineering and manufacturing were his first subjects, but he also imagined great changes in agriculture – the breeding of plants, fruits and flowers, as well as animals, to create ‘super-organisms’, in and out of season, for our benefit and delight. The art and science of the kitchens of Solomon’s House produces superior dishes, as well as wines and other beverages, and printing and textiles have advanced greatly, with new fabrics, papers, dyes and machinery. Even the weather is subject to manipulation, with rain, snow and sunshine under the control of the savants. The details of all these advancements are kept vague, of course (and here’s where Bacon’s insistence on ‘secret knowledge’ plays to his advantage, a point not sufficiently noted by Ball in his need to connect Bacon with the alchemist-magicians of the past), but what is represented here is promise, a faith in human ingenuity to improve on the products of the natural world.

In focusing on all these benefits, Bacon manages largely to sidestep the religious aversion to curiosity as a form of intellectual avarice. However, Bacon and his more curious compatriots were never too far from the magical dark arts. Few intellectuals of this period, for example, would have dismissed alchemy out of hand, in spite of Chaucer’s delicious mockery of it over 200 years before, or Ben Jonson’s more contemporaneous take in The Alchemist. What differentiated Bacon was an interest in system, however vaguely adumbrated, and a harnessing of this system to the interests of the state.

Bacon tried to interest James I in a state sponsored proto-scientific institution, but this got nowhere, largely because he couldn’t devise anything like a practical program for such an entity, but a generation or two after his death, after a civil war, a brief republic and a restoration, the Royal Society was formed under the more or less indifferent patronage of Charles II. Bacon was seen as its guiding spirit, and there was an expectation, or hope, that its members would be virtuosi, a term then in currency. As Ball explains:

The virtuoso was ‘a rational artist in all things’… meaning the arts as well as the sciences, pursued methodically with a scientist’s understanding of perspective, anatomy and so forth. (It is after all in the arts that the epithet ‘virtuoso’ survives today.) The virtuoso was permitted, indeed expected, to indulge pure curiosity: to pry into any aspect of nature or art, no matter how trivial, for the sake of knowing. There was no sense that this impulse need be harnessed and disciplined by anything resembling a systematic program, or by an attempt to generalise from particulars to overarching theories.

Charles II, in spite of having some scientific pretensions, paid scant attention to his own Society, and neglected to fund it. What was perhaps worse for the Society was his amused approval of a hit play of the time, Thomas Shadwell’s The Virtuoso, which satirized the Society through its central character, Sir Nicholas Gimcrack. The play, as well as many criticisms of the Society’s practices by the likes of the philosopher Thomas Hobbes and the aristocratic Margaret Cavendish (Duchess of Newcastle-upon-Tyne), presented another kind of negativity vis-a-vis unbridled curiosity, more modern, if not more pointed than the old religious objections.

The play-goer first encounters Sir Nicholas Gimcrack lying on a table making swimming motions. He tells his visitors that he’s learning to swim, but they are dubious about his method. His response:

I content myself with the speculative part of swimming; I care not for the practick. I seldom bring anything to use; tis not my way. Knowledge is my ultimate end.

This was the updated criticism: pointless observations and experiments, leading nowhere and of no practical use. Gimcrack appears to have been based on Robert Hooke, one of the Royal Society’s most brilliant members, who was suitably enraged on viewing the play. Shadwell mocked Hooke’s prized invention, the air pump, intended to create a vacuum for the purpose of observing objects inserted into it, and he presented a jaundiced view of Gimcrack, through the dialogue of his niece, as ‘a sot that has spent two thousand pounds in microscopes to find out the nature of eels in vinegar, mites in a cheese, and the blue of plums.’ These were all subjects examined in Hooke’s ground-breaking and breath-taking work Micrographia.

Most of Shadwell’s mockery hasn’t stood the test of time, but he was far from the only one who targeted the practices and approach of the Society and of ‘virtuosi’, sometimes with humour, sometimes with indignation. Their criticisms are worth examining, both for what they reveal of the era and for their occasional relevance today. Many of them seem totally misplaced – mocking the ‘weighing of air’, which they naturally saw as the weighing of nothing, or the examining, through the newish tool the microscope, of a gnat’s leg. It should be recalled that Hooke, through his microscopic investigations, was the first to identify and to name the individual cell. Yet it was a common criticism of the era that these explorations were simply time-wasting dilettantism, a view owing largely to ignorance of the interconnectedness of all things that the scientifically literate now take for granted. The philosophical curmudgeon Thomas Hobbes, for example, firmly believed that experiments couldn’t produce significant truths about the world. It seems that the general public, who didn’t have access to such things, saw microscopes and telescopes as magical devices which didn’t so much reveal new worlds as create them. If they couldn’t be verified with one’s own eyes, how could these visions be trusted? And there was the old religious argument that we weren’t meant to see such things, that we should keep to our god-given limitations.

Generally speaking, as Ball describes it, though the criticisms and misgivings weren’t so clearly religious as they had been, they centred on a suspicion about unrestrained curiosity and questioning, which might lead to an undermining of the social order (a big issue after the recent upheavals in England), and to atheism (they were on the money with that one). They had a big impact on the Royal Society, which struggled to survive in the late seventeenth and early eighteenth centuries. It’s worth noting too, that the later eighteenth century Enlightenment on the continent was much more political and social than scientific.

But rather than try to analyse these criticisms, I’ll provide a rich sample of them, without comment. None of them is ‘representative’, but together they give a flavour of the times, or at least of their more conservative strain.

[Is there] anything more Absurd and Impertinent than a Man who has so great a concern upon his Hands as the Preparing for Eternity, all busy and taken up with Quadrants, and Telescopes, Furnaces, Syphons and Air-pumps?

John Norris, Reflections on the conduct of human life, 1690

Through worlds unnumber’d though the God be known,
‘Tis ours to trace him only in our own….
The bliss of man (could pride that blessing find)
Is not to act or think beyond mankind;
No powers of body or of soul to share,
But what his nature and his state can bear.
Why has not a man a microscopic eye?
For this plain reason, man is not a fly.
Say what the use, were finer optics giv’n,
T’inspect a mite, not comprehend the heav’n? …
Then say not man’s imperfect, Heav’n in fault;
Say rather, man’s as perfect as he ought:
His knowledge measur’d to his state and place,
His time a moment, and a point his space.

Alexander Pope, An Essay on Man

There are some men whose heads are so oddly turned this way, that though they are utter strangers to the common occurrences of life, they are able to discover the sex of a cockle, or describe the generation of a mite, in all its circumstances. They are so little versed in the world, that they scarce know a horse from an ox; but at the same time will tell you, with a great deal of gravity, that a flea is a rhinoceros, and a snail an hermaphrodite.

… the mind of man… is capable of much higher contemplations [and] should not be altogether fixed upon such mean and disproportionate objects.

Joseph Addison, The Tatler, 1710

But could Experimental Philosophers find out more beneficial Arts then our Fore-fathers have done, either for the better increase of Vegetables and brute Animals to nourish our bodies, or better and commodious contrivances in the Art of Architecture to build us houses… it would not onely be worth their labour, but of as much praise as could be given to them: But as Boys that play with watry Bubbles, or fling Dust into each others Eyes, or make a Hobby-horse of Snow, are worthy of reproof rather then praise, for wasting their time with useless sports; so those that addict themselves to unprofitable Arts, spend more time then they reap benefit thereby… they will never be able to spin Silk, Thred, or Wool, &c. from loose Atomes; neither will Weavers weave a Web of Light from the Sun’s Rays, nor an Architect build an House of the bubbles of Water and Air…  and if a Painter should draw a Lowse as big as a Crab, and of that shape as the Microscope presents, can any body imagine that a Beggar would believe it to be true? but if he did, what advantage would it be to the Beggar? for it doth neither instruct him how to avoid breeding them, or how to catch them, or to hinder them from biting.

[Inventors of telescopes etc] have done the world more injury than benefit; for this art has intoxicated so many men’s brains, and wholly employed their thoughts and bodily actions about phenomena, or the exterior figures of objects, as all better arts and studies are laid aside.

Margaret Cavendish, Observations upon Experimental Philosophy, 1666

[A virtuoso is one who] has abandoned the society of men for that of Insects, Worms, Grubbs, Maggots, Flies, Moths, Locusts, Beetles, Spiders, Grasshoppers, Snails, Lizards and Tortoises….

To what purpose is it, that these Gentlemen ransack all Parts both of Earth and Sea to procure these Triffles?… I know that the desire of knowledge, and the discovery of things yet unknown is the pretence; but what Knowledge is it? What Discoveries do we owe to their Labours? It is only the discovery of some few unheeded Varieties of Plants, Shells, or Insects, unheeded only because useless; and the knowledge, they boast so much of, is no more than a Register of their Names and Marks of Distinction only.

Mary Astell, The character of a virtuoso, 1696

There are many other such comments, very various, some attempting to be witty, others indignant or contemptuous, and some quite astute – the Royal Society did have more than its share of dabblers and dilettantes, and was far from being simply ‘open to talents’ – but for the most part the criticisms haven’t dated well. You won’t see The Virtuoso in your local playhouse in the near future. Wide-ranging curiosity, mixed with a big dose of scepticism and critical analysis of contemporary knowledge, has proved itself many times over in the development of scientific theory and an ever-expanding world view, taking us very far from the supposedly ‘better arts and studies’ the seventeenth-century pundits thought we should be occupied by. But it has also made us realize that the science flowing from curiosity has mightily informed those ‘better arts and studies’, which can perhaps best be summarized by the four Kantian questions: Who are we? What do we know? What should we do? and What can we hope for?

how did blue whales get so big?


a baby blue

Cetaceans came into being when a group of mammals left the land some 55 million years ago to return to the oceans (creatures first left the oceans for the land some 375 million years ago). The closest land relatives of whales are the artiodactyls, or even-toed ungulates, a large group which includes sheep, goats, cattle, giraffes, camels, llamas, pigs and deer; of these, the hippo is the most closely related to cetaceans. But, of course, since returning to the oceans, the creatures who finally evolved into cetaceans were able to become ‘super-sized’. The blue whale, likely the largest creature ever to exist on this planet, can tip the scales at over 170 tonnes, and can measure well over 30 metres. The largest dinosaur unearthed so far, Argentinosaurus, a titanosaur sauropod (that’s to say a really effing big dino – named for the ancient mythical titans – with a long neck and tail and a comparatively small head, like the brontosaurus of my youth, now sadly out of favour) weighed around 75 tonnes.

Cetaceans have managed to fill a diverse range of ecological niches. Some of the best-known are the blue whale (a filter-feeding baleen whale or rorqual), the orca (often called a killer whale, but in fact it’s the largest species of dolphin) and the sperm whale, the largest of the toothed whales. Their success, and especially that of rorquals, may owe much to the abundance of krill in the oceans. Some researchers have also attributed the great growth spurt of the blue whale over the past few million years to this ready supply of food. It’s been estimated that, in the southern oceans alone, the krill biomass may be as much as 500 million tonnes, twice the biomass of humans on the planet.

Of course the behaviour of humans has had a massive impact on blue whales, especially in the century or so before 1966, when they came under international protection. The Antarctic population before whaling has been estimated at between 200,000 and 300,000, possibly as much as ten times the current population, though numbers are difficult to determine. You can’t help but wonder what would have happened to whale – and krill – populations without human depredations.

Researchers and analysts point to two main and perhaps complementary reasons for whale ginormity: the abundance of food, and the lack of restraint on size in an oceanic medium. I’ll focus on the second reason first. This presumably has to do with physics, my weakest subject, so I want to get it straight in my mind.

Allometry is the study of the size of organisms and what it means in terms of growth, behaviour, environment and other constraints and factors. Allometry helps explain how a large oxygen-breathing mammal can survive in and transport itself through its chosen medium. Whales are ‘neutrally buoyant’ – that’s to say, their body density is equal to the density of the water around them. This means that they don’t have to expend the energy that land animals do in counteracting the effects of gravity – scuba divers have to learn correct underwater breathing to achieve this neutral buoyancy. Every step we landlubbers take involves a lifting up of our bodies against the gravitational force pinning us to the earth. The endless gentle push of gravity is what makes us wrinkle and sag over our lifetime. Okay, let’s not think about that anymore. Locomotion in the water has much to do with allometric scaling, because the rate of oxygen consumption per gram of body mass decreases consistently with increasing body size. Other factors include shape and type of movement, which influence the laminar or turbulent flow around the organism. All of this is very complicated and can be worked out with equations – the Reynolds number, which relates inertial to viscous forces and so predicts the onset of turbulence, being of prime importance, though hard to determine in nature, especially with cetaceans, who seem to break all the rules. That’s to say, there’s much about their physiology and how it’s adapted to water that we still don’t know.
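Just to get that physics straight in my own mind: the Reynolds number is simply speed times length divided by the fluid’s kinematic viscosity, so a quick back-of-envelope sketch shows why flow around a blue whale is so hard to tame (the speed, length and seawater viscosity figures here are assumed typical values, not measurements):

```python
# Rough Reynolds number for a cruising blue whale (illustrative values only).
# Re = v * L / nu, where v is swimming speed, L is body length,
# and nu is the kinematic viscosity of seawater (~1.05e-6 m^2/s).

def reynolds_number(speed_m_s, length_m, kinematic_viscosity=1.05e-6):
    """Return the dimensionless Reynolds number Re = v*L/nu."""
    return speed_m_s * length_m / kinematic_viscosity

# Assumed figures: a 30 m whale cruising at about 5 m/s.
re_whale = reynolds_number(5.0, 30.0)
print(f"Re = {re_whale:.1e}")  # on the order of 10^8 - deeply turbulent flow
```

A Reynolds number in the hundreds of millions puts whale swimming far into the turbulent regime, which is partly why tidy laboratory equations are so hard to apply to cetaceans in the wild.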

Of course, aquatic mammals have to pump blood around their bodies and get air into their lungs just like land mammals. Interestingly, mammals have much the same heart-to-body mass ratio, whether they’re mice or elephants, land or aquatic. That of course means that the blue whale has the biggest heart of any mammal, and the same goes for a number of other organs. Scaling is much the same, for example, for lungs and lung capacity, and for blood, which represents around 5.5% of body mass. And for mammals of similar form, larger ones can travel more quickly, because it costs much the same energy to move one body length. The great body length of a blue whale enables it to move great distances in search of food or for other purposes at less metabolic expense. It also enables it to dive for much longer than other cetaceans. Whales have a lower heart rate and can carry more oxygen through their bloodstream than smaller marine mammals. These are just some of the advantages of size in the oceans.
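To make those two scalings concrete: blood mass grows linearly with body mass (the ~5.5% figure above), while per-gram oxygen consumption falls off with size. A rough sketch – the 3/4 metabolic exponent (Kleiber’s law) is my assumption, not a figure from the sources above:

```python
# Two scalings: blood mass grows linearly with body mass (about 5.5%,
# as quoted above), while metabolic rate per gram falls with size.
# The 3/4 exponent (Kleiber's law) is an assumed value for illustration.

BLOOD_FRACTION = 0.055     # blood as a share of body mass
KLEIBER_EXPONENT = 0.75    # assumed: whole-body metabolic rate ~ M^0.75

def blood_mass_kg(body_mass_kg):
    """Blood mass, scaling linearly with body mass."""
    return BLOOD_FRACTION * body_mass_kg

def per_gram_rate_relative(body_mass_kg, reference_mass_kg):
    """Per-gram metabolic rate relative to a reference animal: (M/M_ref)**(b-1)."""
    return (body_mass_kg / reference_mass_kg) ** (KLEIBER_EXPONENT - 1)

whale_kg, mouse_kg = 170_000, 0.02
print(blood_mass_kg(whale_kg))                     # roughly 9,350 kg of blood
print(per_gram_rate_relative(whale_kg, mouse_kg))  # ~0.02: each gram of whale
                                                   # burns ~2% of a gram of mouse
```

On these assumptions a 170-tonne whale carries over nine tonnes of blood, yet each gram of whale tissue needs only a tiny fraction of the oxygen a gram of mouse does – one way of seeing why size pays off in the oceans.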

Of course, greater mass requires greater volumes of food to sustain it, but krill seems to have provided just about all a blue whale needs in that department, though it’s also partial to a class of small crustaceans called copepods, and it’s happy, too, to consume any other stray crustaceans and little fishes it catches up in its lunge dives through the krill – described recently as ‘the largest biomechanical event on earth’. Its feeding system and technique are adapted to these small but vastly numerous life forms. For all its size, a blue whale’s throat opening won’t allow it to swallow anything larger than a beach ball, yet it can eat up to 40 million krill a day. Its jawline is huge, extending over halfway down its body, and the jaws can open to almost a ninety-degree angle during lunge diving, allowing it to scoop up about 100 tonnes of krill-infested water in about ten seconds. The water is then squeezed out through the baleen with the help of its ventral pouch and massive tongue.
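A quick back-of-envelope check on those feeding figures – note that the one-gram-per-krill figure is a hypothetical average for illustration, not something from the sources above:

```python
# Back-of-envelope check on the lunge-feeding figures quoted above.
# The 1 g per krill figure is a hypothetical average for illustration.

water_per_lunge_tonnes = 100   # water engulfed per lunge (from the text)
lunge_duration_s = 10          # duration of a lunge (from the text)
krill_per_day = 40_000_000     # up to 40 million krill a day (from the text)
assumed_krill_mass_g = 1.0     # hypothetical mass of a single krill

intake_rate_t_per_s = water_per_lunge_tonnes / lunge_duration_s
daily_food_tonnes = krill_per_day * assumed_krill_mass_g / 1e6  # grams -> tonnes

print(intake_rate_t_per_s)  # 10.0 tonnes of water engulfed per second
print(daily_food_tonnes)    # 40.0 tonnes of krill a day, on these assumptions
```

Ten tonnes of water per second, and tens of tonnes of food a day: the sheer throughput helps explain why the whole apparatus – jaws, ventral pouch, baleen – is built on such a scale.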

So it’s understandable why the blue whale has grown to this size, which raises the question – has it ended its growth spurt? There’s a bit of an argument going on about this. Obviously the present moment is but a snapshot, and we can never be certain about where evolution is heading, but often growth spurts in species occur at a rapid clip, and then things stabilize. The blue whales are relatively recent, judged to have split from an ancestor around 10-15 million years ago, but it may be that they grew to their present size quickly after the split. We have no way of knowing as yet, unless we find a massive blue whale fossil dating back more than 10 million years, which is unlikely. However, other ways of knowing might crop up. There’s also an argument that these rorquals have reached their limit due to feeding and oxygen supply limitations. Lots of interesting research questions to ponder over.

Written by stewart henderson

August 26, 2013 at 8:02 pm