Category Archives: Health

Zombie invasion model for surviving plagues

Imagine a highly infectious, people-borne plague for which there is no immunization or ready cure, e.g. leprosy or smallpox in the 1800s, or bubonic plague in the 1500s, assuming the carrier was fleas on people (there is a good argument that people-fleas, not rat-fleas, were the carrier). We’ll call these plagues zombie invasions to highlight that there is no way to cure these diseases or protect against them aside from quarantining the infected or killing them. Classical leprosy was treated by quarantine.

I propose to model the progress of these plagues to know how to survive one, if it should arise. I will follow a recent paper out of Cornell that highlighted a fact, perhaps forgotten in the 21st century, that population density makes a tremendous difference in the rate of plague-spread. In medieval Europe, plagues spread fastest in the cities because a city dweller interacted with far more people per day. I’ll attempt to simplify the mathematics of that paper without losing any of the key insights. As often happens when I try this, I’ve found a new insight.

Assume that the density of zombies per square mile is Z, and the density of susceptible people is S in the same units, susceptible population per square mile. We define a bite-transmission likelihood, ß, so that dS/dt = -ßSZ. The total rate of susceptibles becoming zombies is proportional to the product of the density of zombies and the density of susceptibles. Assume, for now, that the plague moves fast enough that we can ignore natural death, immunity, and the birth rate of new susceptibles. I’ll relax this assumption at the end of the essay.

The rate of zombie increase will be less than the rate of susceptible-population decrease because some zombies will be killed or rounded up. Classically, zombies are killed by shotgun fire to the head, by flame-throwers, or removed to leper colonies. However zombies are removed, the process requires people. We can say that dR/dt = kSZ, where R is the density per square mile of removed zombies, and k is the rate factor for killing or quarantining them. From the above, dZ/dt = (ß-k)SZ.
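As a check on these equations, here is a minimal Euler-integration sketch of the SZR model. This is my own code, not from the Cornell paper; the populations and rate constants are illustrative only, chosen to match the 199-and-1 example discussed below.

```python
# Euler-step sketch of the SZR equations above:
#   dS/dt = -b*S*Z,  dZ/dt = (b - k)*S*Z,  dR/dt = k*S*Z
# b is the bite-transmission likelihood (ß); k is the kill/quarantine rate.
# Illustrative values: 199 susceptibles, 1 zombie, k/b = 0.6.

def simulate_szr(s0=199.0, z0=1.0, b=1.0, k=0.6, dt=1e-4, steps=100_000):
    """Integrate the SZR equations; returns final (S, Z, R) densities."""
    s, z, r = s0, z0, 0.0
    for _ in range(steps):
        flux = s * z * dt       # zombie-susceptible encounters this step
        s -= b * flux           # susceptibles bitten
        z += (b - k) * flux     # net new zombies (bitten minus killed)
        r += k * flux           # zombies killed or quarantined
    return s, z, r

s, z, r = simulate_szr()
print(f"S = {s:.2f}, Z = {z:.2f}, R = {r:.2f}")
```

With k/ß = 0.6 the susceptibles are wiped out, and the final zombie density comes out close to Z0 + (1 – k/ß)S0 ≈ 80.6, consistent with the steady-state argument that follows.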

We now have three coupled, non-linear differential equations. As a first step to solving them, we set the derivatives to zero and calculate the end result of the plague: what happens at t –> ∞. Using just equation 1 and setting dS/dt = 0, we see that, since ß≠0, the end result is SZ = 0. Thus, there are only two possible end-outcomes: either S = 0 and we’ve all become zombies, or Z = 0 and all the zombies are dead or rounded up. Zombie plagues can never end in mixed live-and-let-live situations. Worse yet, rounded-up zombies are dangerous.

If you start with a small fraction of infected people, Z0/S0 << 1, the equations above suggest that the outcome depends entirely on k/ß. If zombies are killed or rounded up faster than they infect and bite, all is well. Otherwise, all is zombies. A situation like this is shown in the diagram below for a population of 200 and k/ß = 0.6.

Fig. 1. Dynamics of a normal plague (light lines) and a zombie apocalypse (dark lines) for an initial population of 199 uninfected and 1 infected. The S and R populations are shown in blue and black respectively; the zombie and infected populations, Z and I, are shown in red. For both models, k/ß = 0.6 and τ = tNß, where N is the total population. With zombies, the S population disappears entirely; with a normal infection, the disease is self-limiting: the infected die off or recover, only a fraction of the population ever becomes infected, and some S survive.

Sorry to say, things get worse for higher initial ratios, Z0/S0 >> 0. For these cases, you can kill zombies faster than they infect you, and the last susceptible person will still be infected before the last zombie is killed. To analyze this, we create a new parameter, P = Z + (1 – k/ß)S, and note that dP/dt = 0 for all S and Z; the path of possible outcomes will always be along a path of constant P. We already know that, for any zombies to survive, S = 0. We now use algebra to show that the final concentration of zombies will be Z = Z0 + (1 – k/ß)S0. Free zombies survive so long as the following sum is positive: Z0/S0 + 1 – k/ß. If Z0/S0 = 1, a situation that could arise if a small army of zombies breaks out of quarantine, you’ll need a high kill ratio, k/ß > 2, or the zombies take over. It’s harder to stop a zombie outbreak than to stop the original plague. This is a strong motivation to kill any infected people you’ve rounded up, a moral dilemma that appears in some plague literature.
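A quick numerical check of this claim (again my own sketch, not the paper’s): with Z0/S0 = 1 and a kill ratio k/ß = 1.5, which is below the required 2, the zombies should still win, ending at Z = Z0 + (1 – k/ß)S0 = P. The numbers below are illustrative.

```python
# Check that P = Z + (1 - k/b)*S is conserved along an SZR trajectory,
# and that with Z0/S0 = 1 and k/b = 1.5 (< 2) the zombies still win.
# Illustrative numbers only.

def p_invariant(s, z, b, k):
    return z + (1.0 - k / b) * s

def run_szr(s0, z0, b, k, dt=1e-4, steps=200_000):
    s, z = s0, z0
    for _ in range(steps):
        flux = s * z * dt
        s -= b * flux               # susceptibles bitten
        z += (b - k) * flux         # net zombie change (negative here)
    return s, z

s0, z0, b, k = 100.0, 100.0, 1.0, 1.5
p0 = p_invariant(s0, z0, b, k)      # P = 100 + (1 - 1.5)*100 = 50
s, z = run_szr(s0, z0, b, k)
print(f"P0 = {p0}, final S = {s:.4f}, final Z = {z:.2f}")
```

Note that here dZ/dt = (ß-k)SZ is negative, so the zombie population shrinks the whole time, yet S hits zero first, leaving P = 50 free zombies standing.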

Figure 1, from the Cornell paper, gives a sense of the time necessary to reach the final state of S = 0 or Z = 0. For k/ß of 0.6, we see that it takes a dimensionless time τ of about 25 to reach this final, steady state of all zombies. Here, τ = tNß, and N is the total population; since t = τ/(Nß), it takes more real time to reach τ = 25 if N is low than if N is high. We find that the best course in a zombie invasion is to head for the country, hoping to find a place where N is vanishingly low, or (better yet) where Z0 is zero. This was the main conclusion of the Cornell paper.

Figure 1 also shows the progress of a more normal disease, one where a significant fraction of the infected die on their own or develop a natural immunity and recover. As before, S is the density of the susceptible and R is the density of the removed + recovered, but here I is the density of those infected by the non-zombie disease. The time-scales are the same, but the outcome is different. As before, by τ = 25 the infected are entirely killed off or isolated, I = 0, even though ß > k. Some non-infected, susceptible individuals survive as well.
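For contrast, here is the same sort of Euler sketch for the normal-disease (SIR) case, where the infected are removed at a rate proportional to I alone: they die or recover on their own, no shotgun squads needed. The parameter values are mine, chosen only to show the self-limiting behavior, not taken from the paper.

```python
# Euler sketch of a standard SIR model:
#   dS/dt = -b*S*I,  dI/dt = b*S*I - g*I,  dR/dt = g*I
# Unlike the SZR model, removal (g*I) needs no susceptibles to do the work.
# Illustrative parameters: the epidemic burns out while some S remain.

def simulate_sir(s0=199.0, i0=1.0, b=0.005, g=0.6, dt=1e-3, steps=100_000):
    s, i, r = s0, i0, 0.0
    for _ in range(steps):
        new_inf = b * s * i * dt   # new infections this step
        removed = g * i * dt       # deaths/recoveries this step
        s -= new_inf
        i += new_inf - removed
        r += removed
    return s, i, r

s, i, r = simulate_sir()
print(f"S = {s:.1f}, I = {i:.2e}, R = {r:.1f}")
```

The infected population goes to zero even though infection outpaces removal at the start (ßS0 > g), and a sizable fraction of the susceptibles never catch the disease at all.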

From this observation, I now add a new conclusion, not from the Cornell paper. It seems clear that more immune people will be in the cities. I’ve also noted that τ = 25 will be reached faster in the cities, where N is large, than in the country where N is small. I conclude that, while you will be worse off in the city at the beginning of a plague, you’re likely better off there at the end. You may need to get through an intermediate zombie zone, and you will want to get the infected to bury their own, but my new insight is that you’ll want to return to the city at the end of the plague and look for the immune remnant. This is a typical zombie story-line; it should be the winning strategy if a plague strikes too. Good luck.

Robert Buxbaum, April 21, 2015. While everything I presented above was done with differential calculus, the original paper showed a more-complete, stochastic solution. I’ve noted before that difference calculus is better. Stochastic calculus shows that, if you start with only one or two zombies, there is still a chance to survive even if ß/k is high and there is no immunity. You’ve just got to kill all the zombies early on (gun ownership can help). Here’s my statistical way to look at this. James Sethna, lead author of the Cornell paper, was one of the brightest of my Princeton PhD chums.

You are what you eat?

The simplest understanding of this phrase is that you should eat good, healthy foods, and that this will make you healthy in body and mind.

The author of the study published this book against GM foods simultaneously with the release of his paper.

The author of this book against unhealthy foods faked his analysis to support the book.

Clearly there is some truth to this. Crazy people look crazy and often eat crazy. Even ‘normal’ people, if they eat too much, are likely to become fat, lazy, and sick. There is a socioeconomic effect (fat people earn less), and there is physiological evidence that gut bacteria affect anxiety and depression (at least in rats). My sense, though, is that this holds only at the diet extremes. There is little or no evidence to suggest you can make yourself more intelligent (or kind, or good) by eating more of the right stuff, or just the right foods in just the right amounts. A better diet can make you look better, but there is a core lie at work when you extend this to imply that the real you is your body, or so tied to your body that a healthy mind cannot be found in a sickly body. Most evidence is that the mind is the real you, and (following Socrates) that beautiful minds are often found in sickly bodies. I’ve seen few (basically, no) healthy poets, writers, or great artists. Neither are there scientists of note (that I can recall) who lived without smoking, drinking, or any bad habits. Many creative people did drugs. George Orwell smoked cigarettes and died of TB, but wrote well to the end. There is no evidence that bad writing or thinking can be improved by health foods. Stupid is as stupid does, and many healthy people are clearly dolts.

Not that it’s always clear what constitutes good health, or what constitutes good food for health, or what constitutes a good mind. Skinny people may be admired and may earn more, but it is not clear they are healthy. Euell Gibbons, the natural-food guru, died young of stomach cancer. Adelle Davis, another “eat right to be healthy” author, died of brain cancer. And Jim Fixx, “the running doctor,” died young of a heart attack while running. Their health foods may have killed them, and it may be that unhealthy foods, like chocolate and coffee, can be good for you. It’s likely a question of balance. While a person who dresses well will feel better, the extreme is probably no good. Very often, a person is drawn after his self-image to become the person he pretends to be. Show me a man who eats only vegetarian, and I’ll show you someone who sees himself as spiritual, or wants to be seen as spiritual. And that man is likely to be drawn to acting spiritual. Among the vegetarians you find Einstein, George B. Shaw, and Gandhi, people who may have been spiritual from the start, but who may have been kept to spirituality by their diets. You also find Hitler: spirituality can take all sorts of forms.

Ward Sullivan in the New Yorker. People eat, drink, and dress like who they are. And people become like those they eat, drink, and dress like.

Choice of diet also helps select the people you run into. If you eat vegetarian, you’re likely to associate with other vegetarians, and you will likely behave like them. If you eat Chinese, Greek, or Mexican food, you’re likely to associate with these communities and behave like them. Similarly, an orthodox Jew or Moslem is tied to his community with every dinner and every purchase from the kosher or halal store.

And now we come to the bizarre science of bio-systems. Each person is a complex bio-system, with more non-human DNA than human, and more non-human cells than human. A person has a vast army of bugs on him, and a similarly vast pool of bugs within him. Recent research suggests that what we eat affects this bio-system, and through it our mental state. Whatever the mechanism, show me someone who drinks only 30-year-old Scotch or 40-year-old French wine, and I’ll show you a food snob. By contrast, show me someone who eats good, cheap food, and drinks good, cheap wine or Scotch (Lauder’s or Dewar’s), and I’ll show you a decent person very much like myself, a clever man who either is a man of the people or who wants to be known as one. “Dis-moi ce que tu manges, je te dirai ce que tu es.” [Tell me what you eat and I will tell you what you are.]

Robert E. Buxbaum, February, 2015. My 16-year-old daughter asked me to write on this topic. Perhaps she didn’t know what it meant, or how true I thought it was, or perhaps she liked my challenges of being 16.

Statistics of death and taxes — death on tax day

Strange as it seems, Americans tend to die in road accidents on tax day. This deadly day is April 15 most years, but in some years April 15th falls on a weekend and the fatal tax day shifts to April 16 or 17. Whatever weekday it is, about 8% more people die on the road on tax day than on the same weekday a week earlier or a week later; data courtesy of the US highway safety bureau and two statisticians, Redelmeier and Yarnell, 2014.

Forest plot of individuals in fatal road crashes for the 30 years to 2008 on US highways (Redelmeier and Yarnell, 2014). The x-axis shows the relative increase in risk on tax days compared to control days, expressed as an odds ratio; the y-axis denotes the subgroup (results for the full cohort in the final row). Column data are counts of individuals in crashes (there are twice as many control days as tax days). Analytic results are expressed with 95% confidence intervals, with control days as referent. The results show increased risk on tax day for the full cohort and a similar increase for 25 of 27 subgroups, with all confidence intervals overlapping the main analysis. Odds ratios are reliable estimates of relative risk when event rates are low, as they are from an individual driver’s perspective. Dividing the experimental subjects into subgroups is a key trick of experimental design.

To confirm that the relation isn’t a fluke, the result of well-timed ice storms or football games, the traffic-death data was broken down into subgroups by time, age, region, etc. (see figure). Nearly every subgroup showed more deaths on tax day than the average of the same weekday a week before and a week after.

The cause appears unrelated to paying the tax bill as such. The increase is nearly equal for men and women, with alcohol and without, and for those over 18 and under (presumably those under 18 don’t pay taxes). The death increase isn’t concentrated at midnight either, as might be expected if the cause were people rushing to the post office. The consistency through all groups suggests this is not a quirk of non-normal data, nor a fluke, but a direct result of tax day itself. Redelmeier and Yarnell suggest that stress — the stress of thinking about taxes — is the cause.
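For those curious how an 8% excess and its confidence interval get computed, here is a hedged sketch. The counts below are made up for illustration (they are not Redelmeier and Yarnell’s data); with two control days for every tax day, the ratio of observed to expected deaths gives the relative risk, and a standard interval comes from the log of that ratio.

```python
# Sketch of the relative-risk (odds-ratio-like) calculation behind a forest
# plot. Counts are HYPOTHETICAL, not the paper's. With low event rates, the
# odds ratio, rate ratio, and relative risk are nearly interchangeable.
import math

def rate_ratio_ci(tax_deaths, control_deaths, control_days_per_tax_day=2):
    """Ratio of tax-day deaths to the per-day average on control days,
    with an approximate (Wald) 95% confidence interval on the log scale."""
    expected = control_deaths / control_days_per_tax_day
    rr = tax_deaths / expected
    se = math.sqrt(1.0 / tax_deaths + 1.0 / control_deaths)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

rr, lo, hi = rate_ratio_ci(tax_deaths=6783, control_deaths=12560)
print(f"RR = {rr:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

An 8% excess shows up as a ratio near 1.08, and the result is “significant” when the entire interval sits above 1.0.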

Though stress seems a plausible explanation, I’d like to see if other stress-related deaths are more common on tax day — heart attack or stroke. I have not done this, I’m sorry to say, and neither have they. General US death data is not tabulated day by day. I’ve done a quick study of Canadian tax-day deaths though (unpublished) and I’ve found that, for Canadians, Canadian tax day is even more deadly than US tax day is for Americans. Perhaps heart attack and stroke data is available day by day in Canada (?).

Robert Buxbaum, December 12, 2014. I write about all sorts of stuff. Here’s my suggested, low-stress income tax structure, and a way to reduce or eliminate income taxes: tariffs – they worked till the Civil War. Here’s my thought on why old people have more fatal car accidents per mile driven.

Change your underwear; of mites and men

Umar, the underwear bomber.

For those who don’t know it, the underwear bomber, Umar Farouk Abdulmutallab, wore his pair of explosive underwear for 3 weeks straight before trying to detonate them while flying over Detroit in 2009. They didn’t go off, leaving him scarred for life. It’s quite possible that the nasty little mites that live in underwear stopped the underwear bomber. Mites are a main source of US allergens too.

Dust mite, skin, and pollen seen with a light microscope. Gimmie some skin.

If you’ve ever used an electron microscope to look at household objects, you’ll find them covered with brick-like flakes of dried-out skin cells: yours and your friends’. Each person sheds his or her skin every month, on average. The outer layer dries out and flakes off as new skin grows in behind it. Skin flakes are the single largest source of household dust, and if not for the fact that these flakes are the main food for mites, your house would be chock-full of your leftover skin. When sunlight shines in your window, you see the shimmer of skin flakes hanging in the air. Under the electron microscope, fresh skin flakes look like bricks, but mite-eaten skin flakes look irregular. Less common, but busier, are the mites.

The facial mite movie. They live on in us, about 1 per hair follicle, particularly favoring eyelashes. Whenever you shower, you shower with a friend.

Dry skin is mostly protein (keratin), plus cholesterol and squalene. This provides great nutrition for dust mites and their associated bacteria. In warm, damp environments, as in your underwear or mattress, these beasties multiply and eat the old skin. The average density of dust mites on a mattress is greater than 2500 per gram of dust.[1] The mites leave behind excrement and broken-off mite limbs: nasty bits that are the most common allergens in the US today.

An allergy to dust shows up as sneezing, coughing, clogged lungs, and eczema. The most effective cure is a high level of in-home hygiene; mites don’t like soap or dry air. You’ve got to mop and vacuum regularly. Clean and change your clothing, particularly your undergarments; rotate your mattresses, and shake the dust out of your bedding. Vacuuming is less effective, as a significant fraction of the nasties go through the filter and get spread around by the vacuum blower.

As it turns out, dust mites and their bacteria eat more than skin. They also eat dried body fluids, poop residue, and the particular explosive used by Umar Farouk, pentaerythritol tetranitrate, PETN (humans can eat and metabolize this stuff too — it’s an angina treatment). The mites turn PETN into less-explosive versions, plus more mites.

Mighty mites as seen with electron microscopy. They eat more than skin.

There are many varieties of mite living on and among us. Belly button mites, for example, and face mites as shown above (click on the image to see it move). On average, people have one facial mite per hair follicle. It’s also possible that the bomber was stopped by poor quality control engineering and not mites at all. Religion tends to be at odds with a science like quality control, and followers tend to put their faith in miracles.

Chigger turning on a dime

Larger than the dust mite is the chigger, shown at left. Chiggers leave visible bites, particularly along the underwear waistband. There are larger-yet critters in the family: lice, bed bugs, crabs. Bathing regularly and cleaning your stuff will rid you of all these beasties, at least temporarily. Keeping your hair short and your windows open helps too. Mites multiply in humid, warm environments. Opening the windows dries and cools the air, and blows out mite bits that could cause wheezing. Benjamin Franklin took air-baths too: walking around naked with the windows open, even in winter. It helped that he lived on the second floor. Other ways to minimize mite growth include sunlight, DOT (a modern version of DDT), and eucalyptus oil. At the very minimum, change your underwear regularly. It goes a long way to reduce dust, and embarrassing moments at the jihadist convention.

Dr. Robert E. Buxbaum, Sept 21, 2014. Not all science or life is this weird and wonderful, but a lot is, and I prefer to write about the weird and wonderful bits. See e.g. the hazards of health food, the value of sunshine, or the cancer hazard of living near a river. Or the grammar of pirates.

Marijuana, paranoia, and creativity

Many studies have shown that marijuana use and paranoid schizophrenia go together, the effect getting stronger with longer-term and heavier use. There also seems to be a relation between marijuana (pot) and creativity. The Beatles and Stones; Dylan, Duchamp, and Obama: creative musicians, painters, poets, and politicians smoked pot. Thus, we can ask what causes what: do crazy, creative folks smoke pot, or does pot-smoking cause normal folks to become crazy and creative, or is there some other relationship? Dope dealers would like you to believe that pot-smoking will make you a creative, sane genius, but this is clearly false advertising. If you were not a great artist, poet, or musician before, you are unlikely to be one after a few puffs of weed.

The Freak Brothers, by Gilbert Shelton. While these boys were not improved by dope, it would be a shame to put the artist in prison for any length of time. But what’s the relationship?

When things go together, we apply inductive reasoning. There are four possibilities: A causes B (pot makes you crazy and/or creative); B causes A (crazy folks smoke pot, perhaps as self-medication); A and B are caused by a third thing, C (in this case, poverty culture, or some genetic mutation). Finally, it’s possible there’s no real relationship, but a failure to use statistics right. If we looked at how many golf tournaments were won by people with W last names (Woods, Wilson, Watson), we might be fooled into thinking it’s a causal relationship. Key science tidbit: correlation does not imply causation.

The most likely option, I suspect, is that some of all of the above is going on here. There is an Oxford University study showing that THC, the main active ingredient in pot, causes some temporary paranoia, and another study suggests that pot smoking and paranoid insanity may be caused by the same genetics. To this mix, I’d like to add another semi-random causative: that heavy metals and other toxins sometimes found in marijuana are the main cause of the paranoia, while being harmful to creativity.

Pot cultivation is easy – that’s why it’s called weed – and cultivation is often illegal, even in countries with large pot use, like Jamaica. As a result, I suspect pot is grown preferentially in places contaminated with heavy-metal toxins like vanadium, cadmium, mercury, and lead. No one wants to grow something illegal on his own, good crop-land. Instead, it will be grown on toxic brownfields where no one goes. Heavy metals are known to be absorbed by plants, and are known to have negative psychoactive properties. Inhalation of mercury is known to make you paranoid: mad as a hatter. Thus, while the pot itself may not drive you nuts, it’s possible that heavy metals and other toxins in the pot-soil may. The creativity would have to come from some other source, and would be diminished by smoking bad weed.

I suspect that creativity is largely an in-born, genetic trait that can be improved marginally by education, but I also find that creative people are necessarily people who try new things, go off the beaten path. This, I suspect, is what leads them to pot and other “drug experiments.” You can’t be creative and walk the same, standard path as everyone else. I’d expect, therefore, that in high use countries, like Jamaica, creative success is preferentially found in the few, anti-establishment folks who eschew it.

Robert E. (landslide) Buxbaum, September 4, 2014. The words pot, marijuana, dope, and weed all mean the same but appear in different cultural contexts. To add to the confusion, Jamaicans refer to pot as ganja or skiff, and their version of paranoid schizophrenia is called “ganja psychosis”. I’m not anti-pot, but favor government regulation – perhaps along the lines of beer regulation, or perhaps the stricter regulation of Valium. My most recent essay was on the tension-balance between personal freedom and government control. I was recently elected in Oak Park’s 3rd voting district. My slogan: “A Chicken in every pot, not pot in every chicken”. I won by one vote. For those who are convinced they’ve become really deep, creative types without having to create anything, let me suggest the following cartoon about talent. Also, if pot made you smart, Jamaica would be floating in Einsteins.

In praise of openable windows and leaky construction

It’s summer in Detroit, and in all the tall buildings the air conditioners are humming. They have to run at near-full power even on evenings and weekends, when the buildings are near-empty, and on cool days. This would seem to waste a lot of power, and it does, but it’s needed for ventilation. Tall buildings are made air-tight, with windows that don’t open — without the AC, there’d be no heat leaving at all, no way for air to get in, and no way for smells to get out.

The windows don’t open because of the conceit of modern architecture; air-tight buildings are believed to be good design because they have improved air-conditioner efficiency when the buildings are full, and use less heat when the outside world is very cold. That’s perhaps 10% of the year.

Modern architecture with no openable windows. Someone wants you to suffer for his/her art.

Another reason closed buildings are popular is that they reduce the owners’ liability in terms of things flying in or falling out. Owners don’t want rain coming in, or rocks (or people) falling out. Not that windows can’t be made with small openings that angle to avoid these problems, but that’s work and money, and architects like to spend time and money only on fancy facades that look nice (and are often impractical). Besides, open windows can ruin the cool lines of their modern designs, and there’s nothing worse, to them, than a building that looks uncool, whatever the energy cost or the suffering of the inmates of their art.

Most workers find sealed buildings claustrophobic, musty, and isolating. That pain leads to lost productivity: Fast Company reported that natural ventilation can increase productivity by up to 11 percent. But, as with leading clothes stylists, leading building designers prefer uncomfortable and uneconomic to uncool. If people in the building can’t smell an ocean breeze, or can’t vent their area in a fire (or following a burnt burrito), that’s a small price to pay for art. Art is absurd, and it’s OK with the architect if fire fumes have to circulate through the entire building before they’re slowly vented. Smells add character, and the architect is gone before the stench gets really bad. 

No one dreams of working in a glass box. If it’s got to be an office, give some ventilation.

So what’s to be done? One can demand openable windows and hope the architect begrudgingly obliges. Some of the newest buildings have gone this route. A simpler, engineering option is to go for leaky construction — cracks in the masonry, windows that don’t quite seal. I’ve maintained and enlarged the gap under the doors of my laboratory buildings to increase air leakage; I like to have passive venting for toxic or flammable vapors. I’m happy not to worry about air circulation failing at the worst moment, and I’m happy not to have to ventilate at night when few people are here. To save some money, I increase the temperature range at night and on weekends, so that the building is allowed to get as hot as 82°F before the AC goes on, or as cold as 55°F without the heat. Folks who show up on weekends may need a sweater, but normally no one is here.

A bit of air leakage and a few openable windows won’t mess up the air-conditioning control, because most heat loss is through the walls and black-body radiation. What you lose in heat infiltration, you gain back by being able to turn off the AC circulation system when few people are in the building (it helps to have a key-entry system to tell you how many people are there), and in the productivity advantage of occasional outdoor smells coming in, or nasty indoor smells going out.

One irrational fear of openable windows is that some people will not close the windows in the summer or in the dead of winter. But people are quite happy in the older skyscrapers (like the Empire State Building) built before universal AC. Most people are nice — or most people you’d want to employ are. They will respond to others’ feelings to keep everyone comfortable. If necessary, a boss or building manager may enforce this, or may have to move a particularly crusty miscreant from the window. But most people are nice, and even a degree of discomfort is worth the boost to your psyche when someone in management trusts you to control something of the building environment.

Robert E. Buxbaum, July 18, 2014. Curtains are a plus too — far better than self-darkening glass. They save energy, and let you think that management trusts you to have power over your environment. And that’s nice.

US cancer rates highest on the rivers, low in mountains, desert

Sometimes I find I have important data that I can’t quite explain. For example, cancer rates in the US vary by more than double from county to county, but not at random. The highest rates are on the rivers, and the lowest are in the mountains and deserts. I don’t know why, but the map shows it’s so.

Cancer death rates map of the US age adjusted 2006-2010, by county. From www.statecancerprofiles.cancer.gov.

Counties shown in red on the map have cancer death rates between 210 and 393 per 100,000, more than double, on average, the rates of the counties in blue. These red counties are mostly along the southern Mississippi, the Arkansas branching to its left, and the Alabama to its right, and along the Ohio and the Tennessee rivers (these rivers straddle Kentucky). The Yukon (Alaska) shows up in bright red, while Hawaii (no major rivers) is blue; southern Alaska (mountains) is also in blue. In orange, showing less-elevated cancer death rates, you can make out the Delaware river between NJ and DC, the Missouri heading northwest from the Mississippi, the Columbia, and the Colorado between the Grand Canyon and Las Vegas. For some reason, counties near the Rio Grande do not show elevated cancer death rates, nor do those along the northern Mississippi or the Colorado south of Las Vegas.

Contrasting this are areas of low cancer death, 56 to 156 deaths per year per 100,000, shown in blue. These appear along the major mountain ranges: the Rockies (both in the continental US and Alaska), the Sierra Nevada, and the Appalachian range. Virtually every mountain county appears in blue. Desert areas of the west also appear as blue, low-cancer regions: Arizona, New Mexico, Utah, Idaho, Colorado, south-west Texas, and southern California. Exceptions to this are the oasis areas in the desert: Lake Tahoe in western Nevada and Lake Mead in southern Nevada. These oases stand out in red, showing high cancer-death rates in a sea of low ones. Despite the AIDS epidemic and better health care, the major cities appear average in terms of cancer. It seems the two effects cancel; see the cancer incidence map (below).

My first thought of an explanation was pollution: that the mountains were cleaner, and thus healthier, while industry had polluted the rivers so badly that people living there were cancer-prone. I don’t think this explanation quite fits, since I’d expect the Yukon to be pollution-free, while the Rio Grande should be among the most polluted. Also, I’d expect cities like Detroit, Cleveland, Chicago, and New York to be pollution-heavy, but they don’t show particularly high cancer rates. A related thought was that specific industries are at fault: oil, metals, chemicals, or coal, but this too doesn’t quite fit: Utah has coal, southern California has oil, Colorado has mining, and Cleveland was home to major chemical production.

Another thought is poverty: that poor people live along the major rivers, while richer, healthier ones live in the mountains. The problem here is that the mountains and deserts are home to some very poor counties with low cancer rates, e.g. in Indian areas of the west, in south Florida, and in north Michigan. Detroit is a very poor city, with land polluted by coal, steel, and chemical manufacture, all the industries you’d expect to be worst. We’re home to the famous black lagoon, and to Zug Island, a place that looks like Hades when seen from the air. The Indian reservation areas of Arizona are, if anything, poorer yet.

Cancer incidence map

Cancer incidence, age adjusted, from statecancerprofiles.cancer.gov

My final thought was that people might go to the river to die, but perhaps don’t get cancer by the river. To check this explanation, I looked at the map of cancer incidence rates. While many counties suppress their cancer-rate data, the pattern in the remaining ones is similar to that for cancer death: the western mountain and desert counties show less than half the incidence rates of the counties along the southern Mississippi, the Arkansas, and the Ohio rivers. The incidence rates are somewhat elevated in the northeast, and lower on the Yukon, but otherwise it’s the same map as for cancer death. Bottom line: I’m left with an observation of the cancer pattern, but no good explanation or model.

Dr. Robert E. Buxbaum, May 1, 2014. Two other unsolved mysteries I’ve observed: the tornado drought of the last few years, and that dilute toxins and radiation may prevent cancer. To do science, you first observe, and then try to analyze.

Is ADHD a real disorder?

When I was in school, ADHD hadn’t been invented. There were kids who didn’t pay attention for a good part of the day, or who couldn’t sit in their seats, but the first activity was called day-dreaming and the second “shpilkas” or “ants in your pants.” These problems were recognized but were considered “normal.” Though we were sometimes disorderly, the cause wasn’t labeled a disorder. It’s now an epidemic.

There were always plenty of kids, me included, who were day-dreamers. Mostly these were boys who would get bored after a while and would start to look around the room, or doodle, or gaze into space thinking of this or that. Perhaps I’d do some writing or math in the margin of a notebook while listening with one ear; perhaps I’d work on my handwriting, or I’d read something in another textbook. This was not called a disorder or even an attention deficit (AD), but rather day-dreaming, wool-gathering, napping, or just not paying attention. Sometimes teachers got annoyed, other times not. They went on teaching, but sometimes tossed chalk or erasers at us to get us to wake up. Kids like me took enough notes to do OK on tests and homework, though I was never at the top of the class in elementary or middle school. The report cards tended to say things like “he could do better if he really concentrated.”  It’s something that could apply to everyone.

Then there were the boys who would now be labeled HD, or “hyperactive disordered.” These were always boys: those who didn’t sit well in their chairs, or fidgeted, or were motor mouths and got up and walked about, or got into fights, or went to the bathroom; these were the class clowns, and the trouble makers — not me except for the fidgeting. Girls would fidget or talk too, and they’d pass notes to each other, but they didn’t get into fights, and they weren’t as disruptive. They tended to have great handwriting, and took lots of notes in class: every single word from the board, plus quite a bit more.

There are different measures of education: if you measure a fish’s skill level by its ability to climb a tree, you’ll conclude the fish is ADD or worse.

Elementary and middle schools had activities to work out the excess energy that caused hyperactivity. We had dancing, shop, fire drills, art, some music, and sports. None of these helped all that much, but they did some good. I think the fire drills helped the most because we all went outside, even in the winter, and eventually we calmed down without drugs. Sometimes a kid didn’t calm down, got worse, and did real damage; these kids were not called hyperactive disordered, but “bad kids” or “juvenile delinquents.” Nowadays, schools have far less art and music, and no shop or dancing. There are a lot more hyperactive kids, and the claim nowadays is that these hyperactive kids, violent or not, are disordered, ADHD, and should be given drugs. With drugs, the daydreamers take better notes, the nappers wake up, and the hyperactive kids calm down. Today about 30% of high-school seniors are given either a version of amphetamine, e.g. Adderall, or of methylphenidate (Ritalin, etc.). The violent ones, the juvenile delinquents, are given stronger versions of the same drugs, e.g. methamphetamine, the drug at the heart of “Breaking Bad.”

Giving drugs to the kids seems to help the teacher a lot more than it helps the kids. According to a famous joke, giving the Ritalin to the teacher would be the best solution. When the kids are given drugs (it’s usually given to boys), the disorderly boys begin to act more like “goodie-goodies.” They sit better and pay attention more; they take better notes and don’t interrupt, but I’m not sure they are learning more, or that the class is, or that they are socializing any better than before. The “goodie-goodies” in elementary school (mostly girls) did great in the early grades, but their good habits seemed to hold them back later. They worked too hard to please and tended not to notice, or pretended not to notice, when the teacher said nonsense. When it came time for independent or creative endeavors, their diligent acceptance of authority stood in the way of excellence.

Venn diagram of ADHD

The hyperactive and daydreamers were more used to thinking for themselves, a prerequisite of leadership. The AD ones had gotten used to half-ignoring the teacher, and the HD ones were more openly opinionated and oppositional: obstreperous, in a word. Those bright enough to get by got more out of their education, perhaps because it was more theirs. To the extent that education was supposed to make you a leader and a thinker, the goodie-goodie behavior was a distraction and a disorder. This might be expected if education is supposed to be the lighting of a fire, not the filling of a pit. If everyone thinks the same, it’s a sign that few are thinking.

Map of ADHD variation with location for US kids ages 6-18, Scripps Research. Boys are 2-3 times more often diagnosed as ADHD; diagnosis and medication increase with grade, peaking currently in early college.

This is not to say that there is no such disorder as ADHD, or no benefit from the drugs. My sense, though, is that the label is given too widely, and that the drugs are given too freely. Today drugs are pushed on virtually any kid who’s distracted, napping, or hyperactive (all the members of the big circles in the Venn diagram above), plus on athletes and others who feign ADD to get these otherwise-illegal performance-enhancing drugs. Currently, about 10% of US kids between 6 and 18 are diagnosed ADHD and given drugs; see the figure. The numbers are higher for boys than girls, higher in the US than abroad, and higher as the kids progress through school. It’s estimated that about 25% of US 12th-grade boys are given amphetamine or Ritalin and its homologs. My sense is that only a small fraction of these deserve drugs: only those with severe social problems, the violent or narcoleptic, those in the smaller circles of the Venn diagram. The test should not be that the kid’s behavior improves on the drugs; everyone’s attention improves when taking speed. ADHD appears more as an epidemic of overworked, undertrained, underfunded teachers and a lack of outlets than an epidemic of disordered kids or of real learning, and real learning is never pretty or easy on all involved.

Robert Buxbaum, April 18, 2014. In general, I think people would be happier if they’d do more art, music, dance, and shop, and if they’d embrace their inner weirdo. It would also help if doctors and teachers would use words rather than initials to describe people. It’s far better to be told you’re hyperactive, or that you’re not paying attention, than to be called ADD, HD, or ADHD. There’s far more room for gradation and improvement. I’m not an expert, just an observant observer.

Genetically modified food not found to cause cancer.

It’s always nice when a study is retracted, especially so if the study alerted the world to a danger that turns out not to exist. Retractions don’t happen often enough, I think, given that false positives should occur in at least 5% of all biological studies. Biological studies typically use 95% confidence limits, a confidence limit that indicates there will be false positives 5% of the time for the best-run versions (or 10% if both 5% tails are taken to be significant). These false positives will appear in 5-10% of all papers as an expected result of statistics, no matter how carefully the study is done or how many rats are used. Still, one hopes that researchers will check for confirmation from other researchers and from other groups within the study. Neither check was done in a well-publicized recent paper claiming genetically modified foods cause cancer. Worse yet, the experiment design was such that false positives were almost guaranteed.
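As a sanity check on that 5% figure, here is a minimal simulation sketch (mine, not from any of the studies discussed). Under a true null hypothesis a correctly computed p-value is uniformly distributed, so a 95%-confidence test flags roughly 5% of null experiments as “significant” even when nothing is going on:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Under the null hypothesis, a correctly computed p-value is uniform on
# [0, 1], so a 95%-confidence test (p < 0.05) will call a result
# "significant" about 5% of the time even when there is no real effect.
n_studies = 100_000
false_positives = sum(random.random() < 0.05 for _ in range(n_studies))
rate = false_positives / n_studies
print(f"false-positive rate: {rate:.4f}")  # close to 0.05
```

With 100,000 simulated null studies, the observed rate lands within a fraction of a percent of the nominal 5%, which is the sense in which 5-10% of honest papers should contain a false positive.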

Séralini published this book, “We are all Guinea Pigs,” simultaneously with the paper.

As reported in Nature, the journal Food and Chemical Toxicology retracted a 2012 paper by Gilles-Eric Séralini claiming that eating genetically modified (GM) maize causes cancerous tumors in rats, despite “no evidence of fraud or intentional misrepresentation.” I would not exactly say no evidence. For one, the choice of rats and the length of the study were such that 30% of the rats would be expected to get cancer and die even under the best of circumstances. Also, Séralini failed to mention that earlier studies had come to the opposite conclusion about GM foods. Even the same journal had published a review of 12 long-term studies, between 90 days and two years, that showed no harm from GM corn or other GM crops. Those reports didn’t get much press because it is hard to get excited about good news; still, you’d have hoped the journal editors would demand that their own review, at least, be referenced in a paper stating the contrary.

A wonderful book on understanding the correct and incorrect uses of statistics.

The main problem I found is that the study was organized to virtually guarantee false positives. Séralini took 200 rats and divided them into 20 groups of 10. Taking two groups of ten (one male, one female) as a control, he fed the other 18 groups various doses of genetically modified grain, either alone or mixed with Roundup, a pesticide often used with GM foods. Based on pure statistics and 95% confidence, you should expect that, out of the 18 groups fed GM grain, there is a 1 − 0.95^18 chance (60%) that at least one group will show a cancer increase, and a similar 60% chance that at least one group will show a cancer decrease at the 95% confidence level. Séralini’s study found both results: one group, the female rats fed 10% GM grain and no Roundup, showed a cancer increase; another group, the female rats fed 33% GM grain and no Roundup, showed a cancer decrease, both at the 95% confidence level. Séralini then dismissed the observation of cancer decrease, and published the inflammatory article and a companion book (“We are all Guinea Pigs,” pictured above) proclaiming that GM grain causes cancer. Better editors would have forced Séralini to acknowledge the observation of cancer decrease, or demanded he analyze the data by linear regression. If he had, Séralini would have found no net cancer effect. Instead he got to publish his bad statistics and, since none of the counter-studies were mentioned, unleashed a firestorm that saw GM grain products pulled from store shelves.

Did Séralini knowingly design a research method aimed to produce false positives? In a sense, I’d hope so; the alternative is pure ignorance. Séralini is a long-time anti-GM activist. He claims he used few rats because he was not expecting to find any cancer; no previous tests on GM foods had suggested a cancer risk!? But this is misdirection: no matter how many rats are in each group, if you use 20 groups this way there is a 60% chance you’ll find at least one group with cancer at the 95% confidence limit (this is Poisson-type statistics; see here). My suspicion is that Séralini knowingly gamed the experiments in an effort to save the world from something he was sure was bad; that he was a do-gooder twisting science for the greater good.
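The 60% figure is easy to verify. A short sketch (my own arithmetic, using the 18 treated groups and one 5% tail described above) compares the exact binomial calculation with the Poisson-style approximation for rare events:

```python
import math

n_groups = 18  # treated groups in the design described above
alpha = 0.05   # per-group false-positive chance at 95% confidence (one tail)

# Exact: probability that at least one of 18 independent null groups
# crosses the significance threshold purely by chance.
p_exact = 1 - (1 - alpha) ** n_groups

# Poisson-type approximation: many trials, small per-trial probability.
p_poisson = 1 - math.exp(-n_groups * alpha)

print(f"exact: {p_exact:.3f}  Poisson approx: {p_poisson:.3f}")
# exact: 0.603  Poisson approx: 0.593
```

Either way the chance of at least one spurious “significant” group is about 60%, and it depends only on the number of groups and the confidence level, not on how many rats each group contains.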

The most common reason for retraction is that the article has appeared elsewhere, either as a substantial repeat from the authors, or from other authors by plagiarism or coincidence. (BC Comics, by Johnny Hart, 11/25/10).

It’s important to cite previous work and aspects of the current work that may undermine the story you’d like to tell; BC Comics, Johnny Hart.

This was not the only major retraction of the month, by the way. The Harrisburg Patriot & Union retracted its 1863 review of Lincoln’s Gettysburg Address, a speech the editors originally panned as “silly remarks” deserving “a veil of oblivion….” In a sense, it’s nice that they reconsidered, and “…have come to a different conclusion…” My guess is that the editors were originally motivated by do-gooder instinct; they hoped to shorten the war by panning the speech.

There is an entire blog devoted to retractions, by the way: http://retractionwatch.com. A good friend, Richard Fezza, alerted me to it. I went to high school with him, then through undergrad at Cooper Union, and to grad school at Princeton, where we both earned PhDs. We’ll probably end up in the same old-age home. Cooper Union tried to foster a skeptical attitude against group-think.

Robert Buxbaum, Dec 23, 2013. Here is a short essay on the correct way to do science, and how to organize experiments (randomly) to make biased analysis less likely. I’ve also written on nearly normal statistics, and near-Poisson statistics. Plus on other random stuff in the science and art world: Time travel, anti-matter, the size of the universe, Surrealism, Architecture, Music.

Murder rate in Finland, Japan higher than in US

The murder rate in Finland and Japan is higher than in the US if suicide is considered a type of murder. In the figure below, I’ve plotted total murder rates (homicide plus suicide) for several developed-world countries. The homicide component is in blue, with the suicide rate above it, in green. In terms of this total, the US is about average among the developed countries. Mexico has the highest homicide rate of those shown, Japan has the highest suicide rate, and Russia has the highest total murder rate shown (homicide + suicide): nearly double that of the US and Canada. In Russia and Japan, some 0.02% of the population commits suicide every year. The Scandinavian countries are quite similar to the US; Japan and Mexico are far worse. Italy, Greece, and the UK are better than the US, both in terms of low suicide rate and low homicide rate.

Combined homicide and suicide rates for selected countries, 2005. Source: Wikipedia.
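To make the units concrete: a rate quoted as a percent of population per year converts to the chart’s per-100,000 figures by a factor of 1,000, and the “total murder rate” is just the stacked sum of the two components. A small sketch with illustrative placeholder numbers (the rates below are my rough stand-ins, not the chart’s actual data):

```python
# Hypothetical rates per 100,000 people per year, for illustration only;
# the real 2005 values are in the chart above (source: Wikipedia).
rates = {
    "Japan":  {"homicide": 0.5,  "suicide": 20.0},
    "US":     {"homicide": 5.6,  "suicide": 11.0},
    "Russia": {"homicide": 20.0, "suicide": 25.0},
}

for country, r in rates.items():
    total = r["homicide"] + r["suicide"]  # combined "murder" rate
    pct_per_year = total / 1000           # per-100,000 -> percent of population
    print(f"{country}: {total:.1f} per 100,000 = {pct_per_year:.4f}% per year")
```

With these placeholder numbers, a suicide rate of 20 per 100,000 works out to 0.02% of the population per year, the same order as the Russia and Japan figure quoted above.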

In the US, pundits like Piers Morgan like to use our high murder rate as an indicator of the ills of American society: loose gun laws are to blame, they say, along with the lack of a social-welfare safety net, a lack of support for the arts, and a lack of education and civility in general. Japan, Canada, and Scandinavia are presented as near-idylls in these regards. When murder is considered to include suicide, though, the murder-rate difference disappears. Add to this that violent crime rates are higher in Europe, Canada, and the UK, suggesting that clean streets and education do not deter crime.

The interesting thing, though, is suicide, and what it suggests about happiness. According to my graphic, the happiest, safest countries appear to be Italy and Greece. Part of this is likely weather: people commit suicide more in cold countries. But another part may be that some people (malcontents?) are better served by dirty, noisy cafés and pubs where people meet and complain, and are not so well served by clean streets and civility. It’s bad enough to be a depressed outsider, but it’s really miserable if everything around you is clean, and everyone is polite but busy.

Yet another thought about the lower suicide rates in the US and Mexico, is that some of the homicide in these countries is really suicide by proxy. In the US and Mexico depressed people (particularly men) can go off to war or join gangs. They still die, but they die more heroically (they think) by homicide. They volunteer for dangerous army missions or to attack a rival drug-lord outside a bar. Either they succeed in killing someone else, or they’re shot dead. If you’re really suicidal and can’t join the army, you could move to Detroit; the average house sold for $7100 last year (it’s higher now, I think), and the homicide rate was over 56 per 100,000. As bad as that sounds, it’s half the murder rate of Greenland, assuming you take suicide to be murder.

R.E. Buxbaum, Sept 14, 2013