I Have a Hard Time Summing Up My Science and Politics Beliefs Into a Slogan

From a half-joking, half-serious post of my own on Facebook:

“SCIENCE IS POLITICAL BECAUSE THERE’S A LOT OF INFLUENCE BY POLITICAL AND POWERFUL CULTURAL INSTITUTIONS, BUT NOT PARTISAN. AND ALSO SCIENTIFIC RESULTS AFFECT MORE OF OUR LIVES. BUT LIKE MAN, WE REALLY SHOULDN’T DO THE WHOLE TECHNOCRACY THING. BUT LIKE EVIDENCE SHOULD MATTER. BUT ALSO VALUES MATTER WHEN EVALUATING STUFF. IT’S COMPLICATED. HAS ANYONE READ LATOUR? OR FEYERABEND? CAN SOMEONE EXPLAIN FEYERABEND TO ME? DOES ANYONE WANT TO GET DRINKS AND TALK AFTER THIS?”

[Comic: “The End Is Not for a While”]

Evidently, I am the alt-text from this comic.

“HERE ARE SOME GOOD ARTICLES ABOUT PHILOSOPHY AND SOCIOLOGY OF SCIENCE” (I didn’t actually give a list, since I knew I would never really be able to put that on a poster, but some suggested readings if you’re interested: the Decolonizing Science Reading List curated by astrophysicist Chanda Prescod-Weinstein, a recent article from The Atlantic about the March for Science, a perspective on Doing Science While Black, the history of genes as an example of the evolution of scientific ideas, honestly there’s a lot here, and this is just stuff I shared on my Facebook page over the last few months.)
“LIKE HOLY SHIT Y’ALL EUGENICS HAPPENED”
“LIKE, MAN, WE STERILIZED A LOT OF PEOPLE. ALSO, EVEN BASIC RESEARCH CAN BE MESSED UP. LIKE TUSKEGEE. OR LITERALLY INJECTING CANCER INTO PEOPLE TO SEE WHAT HAPPENS. OR CRISPR. LIKE, JEEZ, WHAT ARE WE GOING TO DO WITH THAT ONE? SOCIETY HELPS DETERMINE WHAT IS APPROPRIATE.”
“I FEEL LIKE I’M GOING OFF MESSAGE. BUT LIKE WHAT EXACTLY IS THE MESSAGE HERE”
“I DON’T KNOW WHAT THE MESSAGE IS, BUT THESE ARE PROBABLY GOOD TO DO. ESPECIALLY IF THEY INSPIRE CONVERSATIONS LIKE THIS.”
“ALSO, DID YOU KNOW THAT MULTICELLULAR LIFE INDEPENDENTLY EVOLVED AT LEAST 10 TIMES ON EARTH? I’M NOT GOING ANYWHERE WITH THAT, I JUST THINK IT’S NEAT AND WE DON’T TYPICALLY HEAR THAT IN INTRO BIO.”

Lynn Conway, Enabler of Microchips

Are you using something with a modern microprocessor on International Women’s Day? (If you’re not, but are somehow able to see this post, talk to a doctor. Or a psychic.) You should thank Dr. Lynn Conway, professor emerita of electrical engineering and computer science at Michigan and member of the National Academy of Engineering, who is responsible for two major innovations that are ubiquitous in modern computing. She is most famous for the Mead-Conway revolution: she developed the “design rules” used in Very-Large-Scale Integration (VLSI) architecture, the scheme that underlies essentially all modern computer chips. Conway’s rules standardized chip design, making the process faster, easier, and more reliable, and, perhaps most significantly for broader society, easy to scale down, which is why we are now surrounded by computers.
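To make the “design rules” idea concrete, here’s a minimal sketch in the spirit of the Mead-Conway λ (lambda) rules, where every geometric constraint is expressed as a multiple of a single scalable parameter λ, so the same rule set survives process shrinks. The specific multiples and names below are my own illustration, not the published rule tables:

```python
# Toy design-rule check in the Mead-Conway lambda style (illustrative only).
# Every constraint is a multiple of one scalable parameter, lambda, so a
# layout that follows the rules can be shrunk just by shrinking lambda.

RULES_IN_LAMBDA = {
    "min_poly_width": 2,     # illustrative multiples, not the real tables
    "min_metal_width": 3,
    "min_metal_spacing": 3,
}

def passes_rule(rule: str, size_um: float, lambda_um: float) -> bool:
    """True if a feature of size_um satisfies `rule` at process scale lambda_um."""
    return size_um >= RULES_IN_LAMBDA[rule] * lambda_um

# The same rules apply unchanged as the process scales down.
for lambda_um in (3.0, 1.5, 0.5):
    ok = passes_rule("min_metal_width", 2.0, lambda_um)
    print(f"lambda = {lambda_um} um: a 2.0 um metal line "
          f"{'passes' if ok else 'violates'} the width rule")
```

The point is less the checking than the parameterization: one number ties the whole geometry together, which is what made the rules so easy to scale down.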


She is less known for her work on dynamic instruction scheduling (DIS). DIS lets a processor execute a program’s instructions out of order, so that later parts of code that do not depend on the results of earlier parts can start running instead of the whole program stalling until those operations finish. This lets programs run faster and use processor and memory resources more efficiently. Conway went unrecognized for this work for years because she presented as a man when she began working at IBM. When Conway began her public transition to a woman in 1968, she was fired because the transition was seen as potentially “disruptive” to the work environment. After leaving IBM and completing her transition, Conway lived in “stealth”, which prevented her from publicly taking credit for her work there until the 2000s, when she decided to reach out to someone studying the company’s work on “superscalar” computers in the 60s.
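Back to the technical idea for a moment: here’s a toy sketch of dynamic instruction scheduling (my own illustration, not IBM’s actual 1960s design). Each cycle, any instruction whose inputs are ready may issue, instead of everything stalling behind a slow operation:

```python
# Minimal sketch of dynamic instruction scheduling: an instruction may issue
# as soon as its source registers are ready, even if earlier instructions
# are still waiting on theirs.

from dataclasses import dataclass

@dataclass
class Instr:
    text: str
    reads: set      # source registers
    writes: str     # destination register
    latency: int    # cycles until the result is available

program = [
    Instr("LOAD r1 <- mem",     set(),         "r1", 4),  # slow memory load
    Instr("ADD  r2 <- r1 + 1",  {"r1"},        "r2", 1),  # depends on the load
    Instr("MUL  r3 <- r4 * r5", {"r4", "r5"},  "r3", 1),  # independent of both
]

ready = {"r4", "r5"}   # registers whose values already exist
waiting = list(program)
in_flight = []         # (cycle the result appears, destination register)

cycle = 0
while waiting or in_flight:
    cycle += 1
    # results whose latency has elapsed become available
    for done in [f for f in in_flight if f[0] == cycle]:
        in_flight.remove(done)
        ready.add(done[1])
    # issue every waiting instruction whose operands are all ready
    for instr in [i for i in waiting if i.reads <= ready]:
        waiting.remove(instr)
        in_flight.append((cycle + instr.latency, instr.writes))
        print(f"cycle {cycle}: issue {instr.text}")
```

Run it and the MUL issues on cycle 1 alongside the LOAD, while the dependent ADD waits for the load’s result; an in-order machine would have stalled the MUL too.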

Since coming out, Dr. Conway has been an advocate for trans rights, in science and in society. As a scientist herself, Dr. Conway is very interested in how trans people and the development of gender identity are represented in research. In 2007, she co-authored a paper showing that mental health experts seemed to be dramatically underestimating the number of trans people in the US by relying on counts of transition surgeries alone. In 2013 and 2014, Conway worked to make the IEEE’s Code of Ethics inclusive of gender identity and expression.

A good short biography of Dr. Conway can be found here. Or read her writings on her website.

Weirdly Specific Questions I Want Answers to in Meta-science, part 1

Using “meta-science” as a somewhat expansive term for history, philosophy, and sociology of science. And using my blog as a place to write about something besides the physical chemistry of carbon nanomaterials in various liquids.

  • To what extent is sloppy/misleading terminology an attempt to cash in on buzzwords? Clearly, we know that motive exists – there aren’t two major papers trying to narrow down precise definitions of graphene-related terms for nothing. But as the papers also suggest, at what point is it a legitimate debate in the community about setting a definition? “Graphene” was a term that described a useful theoretical construct for decades before anyone ever thought someone could make a real sheet of it, so maybe it isn’t unreasonable that people started using it to describe a variety of physical things related to the original idea.
    • This contains a sort of follow-up: What properties do people use in clarifying these definitions, and how much does that vary by background? Personally, I would say I work much closer to the ideal of “graphene” than lots of people working with more extensively chemically modified graphene derivatives, and I’m fine with using the term for almost anything that’s nearly all sp2 carbon with about 10 layers or fewer. But would a physicist who cares more about the electronic properties, which vary a lot with the number of layers even in that lower limit, consider that maddening?
  • Nanoscience is very interdisciplinary/transdisciplinary, but individual researchers can be quite grounded in just one field. How much work is being done where researchers are missing basic knowledge of another field their work is now straddling?
    • For instance, when reading up on polymer nanocomposites, I’ve seen lots of people with extensive polymer science backgrounds note that many papers don’t refer to basic aspects of polymer physics. My hunch is that a lot of this comes from the fact that many people in this field started out working on the nanoparticles they want to incorporate into composites and then moved into the composites themselves. They may have backgrounds in fields like solid-state physics, electrical engineering, or (inorganic/metallic/ceramic) materials science, where they would have been less likely to deal with polymer theory.
    • Similarly, one paper I read noted that a lot of talk about solutions of nanoparticles would probably be more precise if the discussion were framed in the terminology of colloids and dispersions.
[Book cover]

Oh my gosh, I made fun of the subtitle for like two years, but it’s true

  • Is the ontological status of defects in nanoscience distinct from their treatment in bulk studies of materials? This is a bit related to the first question in that some definitions would preclude the existence of some defects in the referent material/structure.
    • On the other hand, does this stricter treatment make more sense in the few-atom limit of many nanomaterials? Chemists can literally specify the type and location of every atom in successful products of well-studied cluster reactions, though such clusters push the lower limits of even the term “nano”.
    • Is this a reflection of applications of defects at the different scales? (More philosophically worded, are defects treated differently because of their teleological nature?) At the bulk level, we work to engineer the nature of defects to help develop the properties we want. At the nanoscale, some structures can basically be ruined for certain applications by the misplacement of a single atom. Is this also a reflection of the current practical need to scale up our ability to make nanomaterials? E.g., as more realistic approaches to large-scale nanotech fabrication are developed, will the practical treatment of defects in nanomaterials converge to how we treat defects in the bulk?

Reclaiming Science as a Liberal Art

What do you think of when someone talks about the liberal arts? Many of you probably think of subjects like English and literature, history, classics, and philosophy. Those are all a good start for a liberal education, but they are only the fields in the humanities. Perhaps you also think of the social sciences, which help you understand the institutions and actors in our culture: fields like psychology, sociology, or economics. But what about subjects like physics, biology, chemistry, or astronomy? Would you ever think of them as belonging to the liberal arts, or would you cordon them off into the STEM fields? I would argue that excluding the sciences from the liberal arts is both historically wrong and harmful to society.

First, let’s look at the original conception of the liberal arts. Your study would begin with the trivium, the three subjects of grammar, logic, and rhetoric. The trivium has been described as a progression of study into argument. Grammar is concerned with how things are symbolized. Logic is concerned with how things are understood. Rhetoric is concerned with how things are effectively communicated, because what good is it to understand things if you cannot properly share your understanding with other learned people? With its focus on language, the trivium does fit the common stereotype of the liberal arts as a humanistic writing education.

But it is important to understand that the trivium was considered only the beginning of a liberal arts education. It was followed by the supposedly more “serious” quadrivium of arithmetic, geometry, music, and astronomy. The quadrivium is focused on number and can also be viewed as a progression. Arithmetic teaches you about pure numbers. Geometry looks at number to describe space. Music, as it was taught in the quadrivium, focused on the ratios that produce notes and the description of notes in time. Astronomy comes last, as it builds on this knowledge to understand the mathematical patterns in space and time of bodies in the heavens. Only after completing the quadrivium, when one would have a knowledge of both language and numbers, would a student move on to philosophy or theology, the “queen of the liberal arts”.


The seven liberal arts surrounding philosophy.

Although this progression might seem strange to some, it makes a lot of sense when you consider that science developed out of “natural philosophy”. Understanding what data and observations mean, whether they are from a normal experiment or “big data”, is a philosophical activity. As my professors say, running an experiment without an understanding of what I’m measuring makes me a technician, not a scientist. Or consider alchemists, who included many great experimentalists who developed some important chemical insights, but are typically excluded from our conception of science because they worked with different philosophical assumptions. The findings of modern science also tie into major questions that define philosophy. What does it say about our place in the universe if there are 10 billion planets like Earth in our galaxy, or when we are connected to all other living things on Earth through chemistry and evolution?

We get the term liberal arts from Latin, artes liberales, the arts or skills that are befitting of a free person. The children of the privileged would pursue those fields. This was in contrast to the mechanical arts – fields like clothesmaking, agriculture, architecture, martial arts, trade, cooking, and metalworking. The mechanical arts were a decent way for someone without status to make a living, but still considered servile and unbecoming of a free (read “noble”) person. This distinction breaks down in modern life because we are no longer that elitist in our approach to liberal education. We think everyone should be “free”, not just an established elite.

More importantly, in a liberal democracy, we think everyone should have some say in how they are governed. Many major issues in modern society relate to scientific understanding and knowledge. To talk about vaccines, you need to have some understanding of the immune system. The discussion over chemicals is very different when you know that we are made up of chemicals. It is hard to understand what is at stake in climate change without knowing how Earth’s various geological and environmental systems work, and it is hard to evaluate solutions if you don’t know where energy comes from. And how can we talk about surveillance without understanding how information is obtained and how it is distributed? The Founding Fathers said they had to study politics and war to win freedom for their new nation. As part of a liberal education, Americans today need to learn science in order to keep theirs.

(Note: This post is based on a speech I gave as part of a contest at UVA. It reflects a view I think is often overlooked in education discussions, so I wanted to adapt it into a blog post.

As another aside, it’s incredibly interesting that people now tend to unambiguously think of the social sciences as part of the liberal arts while wavering more on the natural sciences, since the idea of a “social” science wasn’t really developed until well after the conception of the liberal arts.)

Thoughts on Basic Science and Innovation

Recently, science writer and House of Lords member Matt Ridley wrote an essay in The Wall Street Journal about the myth of basic science leading to technological development. Many people have criticized it, and it even seems like Ridley has walked back some claims. One engaging take can be found here; it includes a great quote that I think helps summarize a lot of the reaction:

I think one reason I had trouble initially parsing this article was that it’s about two things at once. Beyond the topic of technology driving itself, though, Ridley has some controversial things to say about the sources of funding for technological progress, much of it quoted from Terence Kealey, whose book The Economic Laws of Scientific Research has come up here before.

But also, it seems like Ridley has a weird conception of the two major examples he cites as overturning the “myth”: steam engines and the structure of DNA. The issue with steam engines is that we mainly associate them with James Watt, whom you memorialize every time you fret about how many watts all your devices are consuming. Steam engines actually preceded Watt; the reason we associate them with him is that he greatly improved their efficiency through his understanding of latent heat, the energy that goes into changing something from one phase into another. (We sort of discussed this before. The graph below helps summarize.) Watt understood latent heat because his colleague and friend Joseph Black, a chemist at the University of Glasgow, discovered it.

A graph of temperature versus heat added for water, with horizontal segments at the melting and boiling points.

Latent heat is the heat added to move between phases. In this figure, it is represented by the horizontal segment between the ice curve and the heating-of-water curve, and by the other horizontal segment between the heating of water and the heating of water vapor.
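To get a feel for the sizes involved, here’s a back-of-the-envelope calculation using standard textbook values for water (rounded; treat it as a sketch):

```python
# Rough energy budget for taking 1 kg of ice at -10 C to steam at 100 C.
# Constants are standard textbook values for water, rounded.

C_ICE    = 2.09    # kJ/(kg*K), specific heat of ice
C_WATER  = 4.19    # kJ/(kg*K), specific heat of liquid water
L_FUSION = 334.0   # kJ/kg, latent heat of melting
L_VAPOR  = 2257.0  # kJ/kg, latent heat of vaporization

m = 1.0  # kg

warm_ice   = m * C_ICE * 10.0     # -10 C up to 0 C
melt       = m * L_FUSION         # first horizontal segment on the graph
warm_water = m * C_WATER * 100.0  # 0 C up to 100 C
boil       = m * L_VAPOR          # second horizontal segment

for label, q in [("warm the ice", warm_ice), ("melt it", melt),
                 ("warm the water", warm_water), ("boil it off", boil)]:
    print(f"{label:15s} {q:7.0f} kJ")
```

Boiling the water off takes roughly five times the energy of heating it all the way from 0 to 100 °C, which is why an engine designer who understood latent heat had such an advantage.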

I don’t know whether X-ray crystallography was ever used for industrial purposes in the textile industry, but it has been used pretty consistently in academia since its basic principles were discovered a century ago. The 1915 Nobel Prize in Physics was literally about the development of X-ray crystallography theory. A crystal structure of a biological molecule was determined by X-ray studies at least as early as 1923. The idea that DNA crystallography only took off as a popular technique because of spillover from industry is incredibly inaccurate.

Ridley also seems to have a basic assumption: that government has crowded out the private sector as a source of basic research over the past century. It sounds reasonable, and it seems testable as a hypothesis. As a percentage of GDP (which seems like a semi-reasonable metric for concerns about crowding out), federal spending on research and development has generally been on the decline since the 70s, and is now about a third lower than its relatively stable level in that decade. If private R&D had been crowded out, a competitor dropping by that much seems like a decent opening for some resurgence, especially since one of the most cited examples of private research, Bell Labs, was still going strong all this time. But instead, Bell cut most of its basic research programs just a few years ago.

Federal research spending as a percentage of GDP from the 1976 to 2016 fiscal years. The total shows a slight decrease from 1976 to 1992, a large drop in the 90s, a recovery in 2004, and a drop since 2010.

Federal spending on R&D as a percentage of GDP over time

To be fair, more private philanthropists now seem to be funding academic research. The key word, though, is philanthropist, not the commercial funding Ridley invokes throughout the essay. Also, a significant amount of this new private funding is for prizes, but you can only get a prize after you have done the work.

There is one major thing to take from Ridley’s essay, though I think most scientists would admit it too: it’s somewhat ridiculous to try to lay out a clear path from a new fundamental result to a practical application, and if you hear a researcher claim to have one, keep your BS filter high. As The New Yorker has discussed, even results that seem obviously practical have a hard time clearing feasibility hurdles. (Also, maybe it’s just a result of small reference pools, but a lot of the researchers I read are concerned that research now seems to require some clear “mother of technology” justification.) Similarly, practical developments may not always be obvious. Neil deGrasse Tyson once pointed out that if you spoke to a physicist in the late 1800s about the best way to quickly heat something, they would probably not describe something resembling a microwave.

Common timeframe estimates of when research should result in a commercially available product, followed by translations suggesting how unrealistic this is:

  • “The fourth quarter of next year”: The project will be canceled in six months.
  • “Five years”: I’ve solved the interesting research problems. The rest is just business, which is easy, right?
  • “Ten years”: We haven’t finished inventing it yet, but when we do, it’ll be awesome.
  • “25+ years”: It has not been conclusively proven impossible.
  • “We’re not really looking at market applications right now.”: I like being the only one with a hovercar.

Edit to add: I almost immediately regret using innovation in the title, because I barely address it in the post, and there’s probably a great discussion to have about that word choice by Ridley. Apple, almost famously, funds virtually no basic research internally or externally, which I often grumble about. However, I would not hesitate to call Apple an “innovative” company. There are a lot of design choices that can improve products while being pretty divorced from the physical breakthroughs that made them unique. (Though it is worth pointing out that human factors and ergonomics are very active fields of study in our modern, device-filled lives.)

Red Eye Take Warning – Our Strange, Cyclical Awareness of Pee in Pools

The news has been abuzz lately with a terrifying revelation: if you get red eye at the pool, it’s not from the chlorine, it’s from urine. Or to put it more accurately, from the product of chlorine reacting with a chemical in the urine. In the water, chlorine easily reacts with uric acid, a chemical found in urine and also in sweat, to form chloramines. It’s not surprising that this caught a lot of people’s eyes, especially since those product chemicals are linked to more than just eye irritation. But what’s really weird is what spurred this all on. It’s not a new study that finally proved this; it’s just the release of the CDC’s annual safe swimming guide and a survey from the National Swimming Pool Foundation. And this isn’t the first year the CDC mentioned this fact: an infographic from 2014’s Recreational Water Illness and Injury Prevention Week does, and two different posters from 2013 do (the posters have had some slight tweaks, but the Internet Archive confirms they were there in 2013 and even 2012), and on a slightly related note, a poster from 2010 says that urine in the pool uses up the chlorine.
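For the curious, the textbook chloramine-forming steps look like this, with ammonia standing in for the nitrogen compounds urine and sweat actually deliver (uric acid’s own chlorination products follow messier routes, but the pattern is the same):

```latex
\begin{aligned}
\mathrm{NH_3} + \mathrm{HOCl} &\rightarrow \mathrm{NH_2Cl} + \mathrm{H_2O} && \text{(monochloramine)}\\
\mathrm{NH_2Cl} + \mathrm{HOCl} &\rightarrow \mathrm{NHCl_2} + \mathrm{H_2O} && \text{(dichloramine)}\\
\mathrm{NHCl_2} + \mathrm{HOCl} &\rightarrow \mathrm{NCl_3} + \mathrm{H_2O} && \text{(trichloramine)}
\end{aligned}
```

Note that every step consumes a molecule of hypochlorous acid (HOCl), the free chlorine doing the disinfecting, which is exactly why that 2010 poster could say pee “uses up” the pool’s chlorine.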

A young smiling boy is at the edge of a swimming pool, with goggles on his forehead.

My neighborhood swim coach probably could have convinced me to wear goggles a lot earlier if she told me it would have kept pee out of my eyes.

Here’s what I find even stranger. Last year there was a lot of publicity about a study suggesting the products of the chlorine-uric acid reaction might be linked to more severe harm than just red eye. But neither Bletchley, the leader of the study, nor any of the articles about it link the chemicals to red eye at all, or even mention urine’s role in red eye in the pool. (If you’re curious about the harm but don’t want to read the articles, the conclusion is that the levels don’t even reach the danger limits set for drinking water.) According to The Atlantic, Bletchley worries more that an event like a swimming competition could deplete the chlorine available for disinfecting a pool in only a short amount of time. This is strange because it seems like a great time to bring up that eye irritation can be a decent personal marker for the quality of the pool, as a way to empower people. If you’re at a pool and your eyes feel like they’re on fire, or you’re hacking a lot without having swallowed water, maybe that’s a good sign to tell the lifeguard they need to add more chlorine, because most of it has probably formed chloramines by then.

Discussion of urine and red eye seems to phase in and out over time, and even the focus on whether it’s sweat or urine does too. In 2013, the same person from the CDC spoke with LiveScience and mentioned that the pool smell and red eye are mainly caused by chloramines (and therefore urine and sweat), not chlorine. A piece from 2012 reacting to a radio host goes into detail on chloramines. During the 2012 Olympics, Huffington Post discussed the irritating effects of chloramines on your body, including red eye, and the depletion of chlorine for sterilization, after many Olympic swimmers admitted to peeing in the pool. (Other pieces seem to ignore that this reaction happens and assume it’s fine, since urine itself doesn’t have any compounds or microbes that would cause disease.)

In 2009, CNN mentioned that chloramines cause both red eye and some respiratory irritation. The article is from around Memorial Day, suggesting it was just a typical awareness piece. Oh, and it also refers to a 2008 interview with Michael Phelps admitting that Olympians pee in the pool. The CDC also mentioned chloramines that year, as potential asthma triggers in poorly maintained and ventilated pools and as eye irritants, in a web page and a review study. In 2008, the same Purdue group published what seems like the first study to analyze these byproducts, because others had only looked at inorganic molecules. There the health concern was mainly respiratory problems caused by poor indoor pool maintenance, because these chemicals can start to build up. Nothing about red eye is mentioned there.

In 2006, someone on the Straight Dope discussion boards referred to a recent local news article attributing red eye in the pool to chlorine bonding with pee or sweat, and asked whether or not that was true. Someone on the board claimed it’s actually because chlorine in the pool forms a small amount of hydrochloric acid that will always irritate your eyes. A later commenter linked to a piece by the Water Quality and Health Council pinning chloramine as the culprit. An article from the Australian Broadcasting Corporation talks about how nitrogen from urine and sweat is responsible for that “chlorine smell” at pools, but doesn’t mention it causing irritation or using up chlorine that could go to sterilizing the pool.

Finally, I just decided to look up the earliest mention possible by restricting Google searches to earlier dates. Here is an article from the Chicago Tribune in 1996.

There is no smell when chlorine is added to a clean pool. The smell comes as the chlorine attacks all the waste in the pool. (That garbage is known as “organic load” to pool experts.) So some chlorine is in the water just waiting for dirt to come by. Other chlorine is busy attaching to that dirt, making something called combined chlorine. “It’s the combined chlorine that burns a kid’s eyes and all that fun stuff,” says chemist Dave Kierzkowski of Laporte Water Technology and Biochem, a Milwaukee company that makes pool chemicals.

We’ve known about this for nearly 20 years! We just seem to forget. Often. I realize part of this is the seasonal nature of swimming, so most news outlets will do a piece on being safe at pools every year. But even then, it seems like every few years people are surprised that it is not chlorine that stings your eyes, but the product of its reaction with waste in the water. I’m curious whether I can find older mentions through LexisNexis or the journal searches I can do at school. (Google results for sites older than 1996 don’t make much sense, because it seems like the crawler is picking up more recent related stories that happen to show up as suggestions on older pages.) Also, I’m curious about the distinction between Bletchley’s tests and the pool supplies that measure “combined chlorine” and chloramine, which is discussed in this 2001 article as causing red eye. I imagine his tests are more precise, but Bletchley also says people don’t measure it, and I wonder why.

Grace Hopper, Pioneer of the Computing Age

A white woman in a naval dress uniform is pictured. Her arms are crossed.

If you’re seeing this on any kind of computing device on International Women’s Day, you should thank Dr. Grace Hopper, rear admiral of the US Navy. Hopper created the first compiler, which allowed programming to be done in code that more closely resembles human language, instead of the essentially numerical instructions that work at the level of the hardware.
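To give a flavor of what that meant, here’s a toy sketch of a compiler (mine, not Hopper’s A-0 or FLOW-MATIC, though the “ADD … TO … GIVING …” phrasing is a nod to the FLOW-MATIC/COBOL lineage she shaped). It turns one English-like statement into the kind of numbered, low-level operations the hardware actually runs:

```python
# Toy "compiler": translate an English-like statement into low-level
# (opcode, storage slot) instructions. Illustrative only.

STORAGE = {"PRICE": 0, "TAX": 1, "TOTAL": 2}  # hypothetical memory slots

def compile_statement(stmt: str) -> list:
    """Compile 'ADD x TO y GIVING z' into a list of (opcode, slot) pairs."""
    words = stmt.upper().split()
    if len(words) == 6 and words[0] == "ADD" and words[2] == "TO" and words[4] == "GIVING":
        x, y, z = words[1], words[3], words[5]
        return [
            ("LOAD",  STORAGE[x]),   # fetch the first value
            ("ADD",   STORAGE[y]),   # add the second value
            ("STORE", STORAGE[z]),   # write the result
        ]
    raise SyntaxError(f"don't know how to compile: {stmt}")

for instruction in compile_statement("ADD PRICE TO TAX GIVING TOTAL"):
    print(instruction)
```

The leap Hopper made was realizing the computer itself could do this translation, so programmers could stay at the English-like level.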

These “higher level” languages are what are typically used to create all the various programs and apps we use every day. What have you done today? Word processing? Photo editing? Anything beyond math was considered outside the domain of computers when Hopper started work.