Community, A Science for Everyone

Why Frost Heaves

Frost heaves – back when I was teaching engineering uses of soils 40 years ago – were explained by osmosis, compression and fine-grained soil.  So think clay, or even better, glacial silt as your fine-grained soil.  For compression, remember that just ten or fifteen thousand years back there were some thick glaciers on top of our soil.  For osmosis, think of the difference between rain water, or snow melt, and the groundwater with its calcium salts under my field.  And glacial silts tend to have a lot of exchangeable ions.

The textbook phrase was “If a fine-grained soil, especially a clay, has been compressed, it will normally take in water upon reduction in compressive loading.  In many cases this intake of water may be attributed to osmosis.  The pore water already in the soil is assumed to move into regions of higher concentration between the particles.  More water is drawn into the soil in this process and gradually the water content of the mass as well as the soil volume are significantly increased. . . It has been observed that under certain conditions, when the ground freezes, the surface of the ground rises.  This is termed frost heaving.  Calculations show that in many cases the amount of heave is far more than can be accounted for by the expansion of water on freezing.  It is therefore assumed that additional water must be drawn into the freezing zone.” (Hough, Basic Soils Engineering, 1967, p. 84)

In areas that lack our fine-grained soils, frost heaves aren’t so obvious.  On our roads, the places that once had extreme frost heaves – I remember a particularly bad one in front of the Apeland ranch on 93 – were over-excavated, and the fine materials were replaced with sand and gravel, so that osmosis could not occur.

Community, Meteorology

Ice Pillars

It’s that time of the year again, or rather the temperature is that low again. Strange pillars of light in the sky? Ice pillars, or light pillars, form under conditions of very cold temperatures.

(Photo: light pillars over Edmonton, Canada – not my photo, I wasn’t about to stay out in the cold long enough to take one!)

They are caused by light reflecting off flat, plate-shaped ice crystals in the atmosphere, and careful observation of them can actually provide some insights about the weather.  The source of the reflected light can be anything from the sun to streetlights.  Color will vary depending on the light source.

Since these require very dense, cold air, with many ice crystals, they are common in polar regions.

Community, Meteorology

Measuring Temperature

There was a meme out a while back, pointing out the differences between Fahrenheit, Celsius, and Kelvin.  Measuring was a challenge in those early days – heck, measuring was a challenge to me after I had completed college classes on the topic.  Somewhere in the Glen Lake Irrigation District files of “as built” projects, my blunder on the Tamboer Siphon may still be recorded – I carefully picked the best spot for an inlet structure, numbered it 0+00 and began surveying.  A couple weeks later, I realized that I needed shots further upstream and had to start using negative numbers to finish the project.  It was a solution, but not an elegant solution.  After the experience, I started at 10+00.  Less mockery occurs when your mistakes aren’t so obvious.

Anders Celsius made a similar blunder – he set the boiling point of water at 0 degrees and the freezing point at 100 degrees.  Then, as he continued his studies, he found that the boiling point of water changes with elevation (atmospheric pressure), while the freezing point was independent of both latitude and atmospheric pressure.  After Celsius died, the Royal Swedish Academy of Sciences noted that his successors had reversed the scale.  It does make more sense to start measuring from something constant.

The amazing thing about the Fahrenheit scale is that it came first.  Without a consistent scale on the thermometer, the extra energy involved in shifting from water to ice (or vice versa) makes precise and accurate measurements somewhere between difficult and impossible.  Fahrenheit chose to set his zero at the point where the reaction between ice, water and ammonium chloride quit working.  Once he had that, and marked his thermometer, he could repeat his experiment and confirm that he had a consistent zero, based on a chemical reaction.  His next fixed point assumed that human body temperature was 100 degrees.  Then he could measure the temperature of ice water.  A bit of refinement, and freezing became 32 degrees, body temperature 96 degrees, and individual degrees could be marked by cutting the difference in half – 32 to 16, 16 to 8, 8 to 4, 4 to 2, 2 to 1 – and in 5 halvings Fahrenheit had the gradations on his thermometer.  In the US we still use his method, though the rest of the world uses the modification of the Celsius system.
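If the halving is hard to picture, here’s a minimal sketch of the arithmetic – nothing from Fahrenheit’s notes, just the bisection itself:

```python
# Repeatedly bisect the 32-degree span between Fahrenheit's zero
# (the brine mixture) and the freezing point of water.
span = 32
for step in range(1, 6):
    span /= 2
    print(f"halving {step}: marks every {span:g} degrees")
# After 5 halvings the marks are 1 degree apart.
```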

William Thomson (Baron Kelvin) came up with the Kelvin scale in 1848 – where zero was based on his calculations of absolute zero.  Thomson’s calculations showed absolute zero at -273 degrees centigrade.  In the following century and a half, his figure has been refined to -273.15.
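For reference, the three scales are tied together by simple linear conversions – a quick sketch using the standard formulas, nothing specific to this story:

```python
def f_to_c(f):
    # Fahrenheit to Celsius
    return (f - 32) * 5 / 9

def c_to_k(c):
    # Celsius to Kelvin; 0 K is absolute zero, -273.15 C
    return c + 273.15

print(f_to_c(32))       # 0.0   - freezing point of water
print(f_to_c(212))      # 100.0 - boiling point at sea level
print(c_to_k(-273.15))  # 0.0   - absolute zero
```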

All told, it’s kind of humbling to see what these folks could do in the 18th and 19th centuries, without calculators and computers.  Thermometers of sorts were invented long before – but developing a universal measuring scale was long in coming.

A Science for Everyone, Community, Meteorology

Time to Look at Snow

In the last half of the seventies, the Monday after Christmas was committed.  I would meet Jay Penney at Graves Creek, get into the Snow Survey crummy, and then we would measure the snow depth at Weasel Divide, Stahl Peak, and Graves Creek.  It’s so long ago that none of our measurements remain in the 30-year average.  We were the moderns – 440 cc Skidoo Alpines, and clockwork recorders that measured the snow-water equivalents through the month – all we needed to do was wind the clock and pack the chart away.  The guys we followed had done things differently – drive up Burma Road, snowshoe or ski to Weasel Cabin, build a fire, measure the snow course, eat dinner, sleep, hike into Stahl the next morning, measure the snow course, camp in the lookout, hike down, measure Graves Creek, reach the road and drive back into town.

My work was transitory – duplicating the traditional measurement dates while working with new recorders, battery power, and early solar cells – the technology that would make us unnecessary.

My work was easier than my predecessors’ – I used snowshoes where I couldn’t take a snowmobile.  Today, the remote monitoring is so good that I can click the link and learn what the snowpack is on Stahl without leaving the warmth of my house.  Try it, you’ll like it.  https://www.nwrfc.noaa.gov/snow/snowplot.cgi?STAM8

Date       | Time PST | Snow Water Equivalent (in) | Snow Depth (in) | Snow Density (%) | Precip To-Date (in) | Current Temp (°F)
12/27/2021 | 0900     | 18.9                       | 67.0            | 28               | 36.9                | 6.6
12/27/2021 | 0800     | 18.9                       | 67.0            | 28               | 36.9                | 7.0
12/27/2021 | 0700     | 18.9                       | 67.0            | 28               | 36.9                | 5.0
12/27/2021 | 0600     | 18.9                       | 67.0            | 28               | 36.9                | 3.0
12/27/2021 | 0500     | 18.9                       | 67.0            | 28               | 36.9                | 3.0
12/27/2021 | 0400     | 18.9                       | 67.0            | 28               | 36.9                | 2.1
12/27/2021 | 0300     | 18.9                       | 67.0            | 28               | 36.9                | 0.9
12/27/2021 | 0200     | 18.9                       | 67.0            | 28               | 36.9                | 4.6
12/27/2021 | 0100     | 18.9                       | 68.0            | 28               | 36.9                | 2.1
12/27/2021 | 0000     | 18.9                       | 68.0            | 28               | 36.9                | 13.6

Nearly 19 inches of water in 67 inches of snow – 28% density, and warming after a near-zero night.  Of course, this is what would have been the January 1 run, and definitely not the time to announce whether the year was a high or low snowpack.  The next chart replaces the hand-written notes that Jay carried when I started, or that I carried after congestive heart failure took him off fortyfive time – 045 was the code we used for time spent on snow surveys.
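The density column is just the ratio of the first two measurements – a quick sketch using the 0900 reading from the table above:

```python
# Snow density = snow-water equivalent / snow depth
swe_inches = 18.9    # snow water equivalent, 0900 reading
depth_inches = 67.0  # snow depth, same reading

density = swe_inches / depth_inches
print(f"density: {density:.0%}")  # 28%, matching the station's column
```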

26% above the thirty-year median.  It’s a number, but if we use it, we’re projecting from too little data.  Things can change with January and February’s snows – but above the mean is good.  Full soil profiles are good for plant growth and delay the susceptibility to fire.  And the Corps of Engineers paid that fortyfive time to get information to manage the reservoirs.

The next chart shows the 30-year median and average, plus this year’s numbers, in the lines – but the shaded area shows the variance.  You may note that by August 1 the snow is always gone – but the chart shows it has usually melted off by the first week of June.

As an old man, it’s good to be able to keep up on the information.  We did haul a lot of equipment in and out on those Alpines to help move toward the automated systems we have today.

A Science for Everyone, Community, Meteorology

I Could Visualize the Adiabatic Lapse Rate

Fall ended, and my winter started in December.  It may be due to a warming global temperature – but in the seventies, when much of my life was dedicated to snow surveys, I would have been explaining it by la nina.  Add the tilde to the second n – the Spanish word for little girl, and the name for the situation off the coast of Peru that increases precipitation here in the Northwest.

I’m not one to complain about rain – one of the predictable portions of our climate is that early Summer has rain, and we tend to harvest alfalfa later than the optimal 10% bloom because of it.  After July 4, we’re moving into the dry times that make drying hay easier – even if it’s a bit late.  You develop an appreciation for rain when your climate gives you long, hot dry spells.

This Fall, I could watch Mount Marston and Stahl Peak as the snow would come and go – I have a good view of their western slopes, and my thermometer lets me watch the difference in temperature.  I live at about 3,000 feet elevation.  The tops of those two mountains are at about 6,000 feet.  It’s one of the great things about living here – mountains are great, and altitude kind of sucks.  Nothing personal, but I like 3,000-foot valleys and 6,000-foot mountains a lot more than 6,000-foot valleys and 10,000-foot mountains.  My lungs fit better.

Back to the topic – the adiabatic lapse rate.  As you go up, atmospheric pressure goes down.  It is kind of obvious – as you climb the mountain, there is less atmosphere above you.  Less atmospheric pressure means that there are fewer particles of atmosphere – nitrogen and oxygen – in any particular unit of volumetric measurement you care to use.  Colloquially, the air is thinner.

It kind of makes sense – with more space between the molecules, molecules hit each other less frequently.  Fewer molecular collisions correlate with a drop in temperature.  (Physicists might invoke causation here – my training really doesn’t let me offer an explanation, but I can point out a correlation.)

So we need two tools to develop an understanding of the adiabatic lapse rate – the thermometer and the barometer.  Evangelista Torricelli invented the barometer in 1643.  Fahrenheit invented the alcohol thermometer in 1709, and a more useful mercury thermometer in 1714.  Paul Kollsman modified the idea of the barometer and developed a usable altimeter in 1928.

The adiabatic lapse rate is defined as the rate at which the temperature of an air parcel changes in response to the compression or expansion associated with elevation change, assuming no heat exchange occurs between the air and its surroundings.  Aviation, and icing wings, gave an impetus to quantifying this rate of temperature change – and the need for weather forecasts provided even more.  For dry air the number is about 5.4 degrees Fahrenheit for every 1,000 vertical feet – 9.8 degrees Celsius per 1,000 meters.  (In the real world it can vary from about 4 to 9 degrees Celsius per 1,000 meters, depending on humidity, etc.)
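To put numbers on the view from my window – a minimal sketch, assuming the dry-air rate and that all we know is the valley reading:

```python
# Estimate a peak temperature from a valley reading using the
# dry adiabatic lapse rate (~5.4 degrees F per 1,000 feet).
LAPSE_F_PER_1000_FT = 5.4

def temp_at_elevation(valley_temp_f, valley_ft, peak_ft):
    drop = LAPSE_F_PER_1000_FT * (peak_ft - valley_ft) / 1000
    return valley_temp_f - drop

# 32 F at my 3,000-foot elevation suggests about 16 F on a 6,000-foot peak.
print(f"{temp_at_elevation(32, 3000, 6000):.1f}")  # 15.8
```

That difference is why the snow line draws a contour on the mountainside.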

So this Fall, with its snows and thaws, left me with elevation contours I could watch on the mountainsides – something that the deep snows of winter do not readily allow in the Spring as things warm up.  Since nobody came along and asked “What’s the temperature half-way up Marston?” it has been a private observation – but it has been fun to watch.

A Science for Everyone, Community, Demography

Our Communities by ACS Numbers

I listened to a comment about the median household income in Trego – and defaulted to my professional statement before retirement – “That’s American Community Survey data, and it’s not very good for small communities.”  When I checked it, the $36,458 median household income for Trego translates as “somewhere between $27,478 and $45,438.”  ACS data has its uses, but it has to be used with a lot of caution.

So here’s a little ACS data on our communities – you can check for margin of error (MOE) here.   I wouldn’t recommend using any of the numbers without reviewing MOE – but just sharing the data shows the variance.  It’s safe to admit that my household was one selected for the ACS. With two retirees at home, I didn’t hurt Trego’s school enrollment rate, I raised the percentage of bachelor’s degrees or above, kept the employment rate down, and raised the median age.

                          | Trego CDP | Fortine CDP | Eureka CCD | Rexford Town
Population                | 515       | 317         | 6,470      | 78
Median Age                | 60.5      | 27.9        | 50.1       | 53.3
Median Household Income   | $36,458   | $68,036     | $40,827    | $30,481
Bachelor’s Degree or more | 26.10%    | 19.20%      | 22.40%     | 0.00%
Veterans                  | 6.80%     | 16.20%      | 12.90%     | 16.80%
Poverty                   | 9.50%     | 5.20%       | 20.40%     | 23.60%
School Enrollment         | 97.80%    | 72.30%      | 81.90%     | 100%
Employment Rate           | 40.20%    | 59.50%      | 38.30%     | 20.60%
Housing Units             | 283       | 177         | 3,716      | 73
Occupied Housing Units    | 237       | 144         | 2,796      | 46
Disabilities              | 31.10%    | 18.80%      | 26.70%     | 65.90%
Children under 18         | 9.30%     | 32.50%      | 22.10%     | 13%

It looks like the Fortine sample drew some younger respondents.  Eureka CCD, with a larger population and larger sample, is probably closer to correct, and the town of Rexford data is probably close to useless, because the small sample size almost guarantees sampling bias.

Community, Demography

Trego and the American Community Survey

Montana’s American Community Survey is composed from the final interviews conducted with 10,138 households in the state.  Since Montana has 519,935 households, the chance of any household being in the final interview is 10,138 out of 519,935 = 0.0195 – right around 2% of the state’s households are included in the survey.  Since Trego shows 295 households, we can guess that our community data is assembled from somewhere around 6 completed interviews.

This table about sample sizes is from kenpro.org
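That table appears to follow the Krejcie and Morgan (1970) sample-size formula – the attribution is my assumption, but the formula reproduces the table’s numbers.  A minimal sketch, at the table’s standard parameters (95% confidence, 5% margin of error, maximum variability):

```python
def required_sample(N, chi_sq=3.841, P=0.5, d=0.05):
    # Krejcie & Morgan (1970): chi_sq is chi-square for 1 degree of
    # freedom at 95% confidence, P the assumed proportion (0.5 gives
    # maximum variability), d the margin of error.
    n = (chi_sq * N * P * (1 - P)) / (d**2 * (N - 1) + chi_sq * P * (1 - P))
    return round(n)

print(required_sample(250))        # 152
print(required_sample(75_000))     # 382
print(required_sample(1_000_000))  # 384
```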

As you will note, a population of 250 calls for 152 samples, and the break comes in the lower right corner – a sample of 382 covers a population of 75,000, while 384 is good for a million.  I’m not going into details about sampling – this is a blog, not a stats class. If enough people ask for the stats instruction, I’ll do another article.  Suffice to say, we can expect the numbers on the ACS to be pretty vague in small communities. So let’s look at the ACS data: data.census.gov

The top of the page shows:

Total Population: 515
Median Household Income: $36,458
Bachelor’s Degree or Higher: 26.1%
Employment Rate: 40.2%
Total Housing Units: 283
Without Health Care Coverage: 9.0%
Total Households: 249

As we go further down the page, we start to encounter the variance – the range the number represents.  That 515 population is taken from the decennial census, not the ACS.

Median Age:    60.5 +/- 3.2 years 

16.8% of Trego folks speak a language other than English at home – plus or minus 15.4%, so that’s somewhere between 6 and 166 people.  Probably not a particularly useful piece of information.

That Median Household Income turns out to be plus or minus $8,980: the number can be as much as 24.6% off either way.  It can be as low as $27,478 or as high as $45,438.  As we move into the full chart on that number, we see that the number of households lists the margin of error as plus or minus 67.  Could be as low as 182 or as high as 316. 
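The arithmetic behind those ranges is simple enough – a quick sketch (ACS margins of error are published at the 90% confidence level, so dividing by 1.645 recovers the standard error):

```python
def moe_range(estimate, moe):
    # Turn an ACS estimate and its margin of error into a range,
    # the percent slop either way, and the implied standard error.
    low, high = estimate - moe, estimate + moe
    pct = moe / estimate
    std_err = moe / 1.645  # ACS MOEs are published at 90% confidence
    return low, high, pct, std_err

low, high, pct, se = moe_range(36_458, 8_980)
print(low, high)     # 27478 45438
print(f"{pct:.1%}")  # 24.6%
```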

That 26.1% of Trego residents with a bachelor’s degree or more has a 12% margin of error – it could be as low as 14.1% or as high as 38.1%.  It shows 8.1% of our residents holding graduate or professional degrees, but doesn’t give a margin of error there.

The school board will be pleased to know that 97.8% (plus or minus 6.6%) of our kids are enrolled in Kindergarten to 12th grade.  Might even surprise the County Superintendent.  Pretty sure some kids out there are home-schooled.

That 40.2% employment rate (+/- 12.9%) looks low – but I guess it fits right in with a median age over 60 and 31.1% (+/-9.4%) disability. 

And finally, there are 48 women 15 to 50 years old – but the margin of error is 38, so it translates to somewhere between 10 and 86.

The ACS data is good – but the sample for Trego was small, and not checking the limitations lets us make blunders.

A Science for Everyone

Thoughts on Banning Theories

I’m a sociologist.  I use theory to explain human behavior.  As a profession, we recognize our basic paradigms – Structural Functionalism, Conflict Theory, and Symbolic Interaction.  In my use of these, Conflict Theory is essentially the flip side of Structural Functionalism – one shows how societies work, how they function and structure themselves, while the other looks at the spots and times when conflict takes over.  Symbolic Interaction deals with the fact that socially, we communicate with symbols.

I am more comfortable with conflict theory and symbolic interaction – but that doesn’t mean I can afford to ignore the Structural Functionalism paradigm.  It does explain some portion of our social world.  It’s basic Durkheim – and his thoughts are basic to my discipline.  He looked at how society worked.  Karl Marx, with conflict theory, looked at the spots where society did not work.  Karl, who wrote the four-volume Das Kapital, essentially spent a lifetime studying capitalism and its flaws, its weaknesses.  He seems better recognized for the 50 pages of the Communist Manifesto – yet it does seem a little unfair that his major work is less recognized.

Still, it was conflict theorists who developed Critical Theory – Adorno, Foucault, etc.  Critical Theory differs from the paradigms I prefer in that it looks at critiquing and changing society. My perspective is that my discipline should focus on understanding or explaining society.  No matter how good I am, I prefer not to make the decisions on how people should live.

That doesn’t mean there is no place in my world for Critical Theory.  Adorno’s work on the authoritarian personality has a place to meld in with the basic social conflict paradigm.  Critical Race Theory operates from the assumption that a society based on values and beliefs that grew in Europe needs drastic change, based on race, to improve.  To my way of thinking, its origins with the legal profession move it into a system of analysis that is scientifically weak – the idea of the “reasonable man” that is basic to legal understanding is not the same as the scientific method.

It may be correct – but I want to examine the premise with a value-neutral approach and dig out as many statistics as I can.  My disagreement is not that the theoretical approach is useless – instead, my disagreement is that the methodology lacks scientific rigor.

I have the same problem with Creation Science – the folks who provide the answers tend to have arguments that are, at best, weakly supported.  That doesn’t mean I want to eliminate the theory – someone in the future may do a better job with it.  Likewise, I may see a better scholar working with Critical Race Theory.  To move away, into physics, Maxwell’s demon did provide an explanation that violated the second law of thermodynamics.  The fact that the little demon didn’t have face validity hasn’t stopped physicists from working to refute the explanation for a century and a half.  We don’t need to ban theories – we need to test them responsibly.

A lot of people have attempted to use the concepts in Marx’ Communist Manifesto – so many that we can use the data we can harvest as if it were quasi-experimental.  I’ve watched communism work on Hutterite colonies – but it has several unique attributes that aren’t present in the Soviet system, or Cuba, etc.  While most Hutterite Colonies are successful, they include a religious commitment toward communal ownership, and I haven’t seen any colonies with walls to keep people from leaving. 

A Science for Everyone

The Quality of Data

We live in a world filled with data – but a lot of the presentations are slanted.  Sometimes the slant is political, sometimes the slant is a bizarre sense of humor.  I like Wikipedia – but I don’t rely on it.  I tapped in to look for a bio on George Washington Carver, and I read the damnedest story about carving peanuts into busts of our first president.  If I want satire, I’ll go to the Onion or the Babylon Bee.  Wiki is accessible, fast, and I’ll continue to use it – but I check wiki data against other sources.  Using Wiki as a reliable source of data is similar to accepting President Biden as a fact-checker.

If I want information on shootings and murders in Chicago, I start with https://heyjackass.com/  It’s reliable, but not respectable.  They even sell T-shirts.  I’d never use it in a professional article – but whoever puts the data together does a pretty good job.  For example, as I write this, heyjackass shows

Year to Date

Shot & Killed: 586

Shot & Wounded: 2843

Total Shot: 3429

Total Homicides: 619

It’s a fast source of data that usually checks out. It even goes into neighborhoods, cause of death, race and gender – well, I’d say race and sex, since it lists male and female, but I may be a bit old fashioned.  It would be nice if all the violent cities had their own heyjackass, but this one seems unique to Chicago.

Climate data – at least the sort of data that shares first and last frosts, annual precipitation, and other medians gleaned from past records – is much more available.  For years, while some stuck with the Farmers Almanac, we carried with us Climate and Man – the 1941 Yearbook of Agriculture that had compilations for most of the US.  Now I can get online to check snow depth at each snow course, and NOAA offers answers to all sorts of questions.  Climate data is vastly improved – though you still need to weed through and select reliable sources.  Personally, I stick with USDA and NOAA.

It is hard to find quality data on illegal immigrants and crime.  Texas’ Department of Public Safety provides data on crimes and convictions in Texas, but other states don’t provide data of similar quality.  I’m not sure we can generalize from Texas – but better data is hard to find.

The quality of data on abortion is impressive – each state provides data in a similar form.  You can sort between states and years – there’s a requirement that data be kept and published.  Unlike crime and illegal immigrants, this data is easy to access and use.

This publication presents itself as quality data: “30 Facts You Need to Know”.

Unfortunately, the folks who put it together didn’t include the links to those 30 facts that make them easy to confirm or reject.  I really don’t know which of the “30 Facts” I should accept and which ones should be rejected.

There is a lot more data available than there was in my younger days.  But a lot of that data is still less than easily confirmed – and a lot of folks are still trying to pass opinion off as fact.

A Science for Everyone

Quasi-Experimental Research and Old Ammo

Ammoland has an article at ammoland.com that shows what we term “quasi-experimental” research.

As a sociologist – studying people in groups – using experimental methodology has some ethical drawbacks.  So we’re probably more likely than most to look for situations that allow some of the inferences we can make without well-designed, well-controlled experiments.

This study isn’t sociology – it’s about how well .22 ammunition that has been stored for 65 years will work. Quasi-experimental research depends on serendipity – in this case, the research isn’t on 25-year-old ammunition or 50-year-old ammunition, like it might be in a designed experiment.  It’s on 65-year-old ammunition because that was the oldest stash left when a competitive shooter died.

“Over 20 thousand rounds of the cache was Remington standard velocity ammunition obtained in or prior to 1956, transferred to quart jars from boxes by 1956. It was stored for 15-17 years in an attic in Madison, Wisconsin, then underground from 1970-72 to 2018 in a basement in Middleton, Wisconsin. After the ammunition was purchased from the estate, it was moved across the country, then stored in a secure underground location.”

Quasi-experimental – most 22 ammunition isn’t transferred to quart jars and sealed, but attic and basement storage is normal.  I’m not certain what qualifies as a secure underground location (it brings to mind finding a blasting cap box in the old root cellar – Dad quickly relieved us of that treasure).

The article is worth reading – the author documents reliability and group size – two relevant measurements.  If I were doing the research, I’d probably include some new Remington standard velocity as a control.  He used CCI – and it was likely as good a control as Remington subsonic would be . . . 65 years has probably seen as great a change within Remington’s factory as between Remington and CCI.

His results: “Velocity measurements for 50 rounds, average velocity, Standard Deviation, extreme spread in feet per second (fps).

  • CCI Standard Velocity:   Average 1072.3 fps, SD 17.5, extreme spread 84 fps, 1035 to 1119.
  • Old Remington Standard Velocity: Average 1098.9 fps, SD 19.8, Extreme spread 101 fps, 1041 to 1142.”
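For anyone who wants to reproduce those summary numbers from their own chronograph strings – a minimal sketch; the velocities below are hypothetical, not Weingarten’s raw data:

```python
import statistics

def summarize(velocities_fps):
    # Average, sample standard deviation, and extreme spread.
    avg = statistics.mean(velocities_fps)
    sd = statistics.stdev(velocities_fps)
    spread = max(velocities_fps) - min(velocities_fps)
    return avg, sd, spread

# Hypothetical 10-shot string - NOT the article's 50-round data.
string = [1072, 1095, 1088, 1101, 1069, 1110, 1083, 1092, 1077, 1098]
avg, sd, spread = summarize(string)
print(f"average {avg:.1f} fps, SD {sd:.1f}, extreme spread {spread} fps")
```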

Weingarten describes how he intends to continue the test until 2056 or later.  For right now, his results suggest that I might have been better off to store my ammunition in canning jars – but it should still be reliable when Sam inherits it.  I’d really encourage reading the article – the experimental method isn’t confined to university campuses. https://www.ammoland.com/2021/09/shelf-life-22-rimfire-ammunition-test-65-year-old-ammo/#axzz77CythOXe