Agreeing(?) ( Two Disagree, when Now becomes Won, Part Two.) An Act of Five Parts #WrongKindofGreen #DistinctionRebellion #ConquestofDough


[Figure: Glassman diagram]

https://twitter.com/PMotels/status/1094842262712922112

Out of the Memory Hole.
Dr J. A. Glassman on a pre-Recusant Judith Curry’s blog.

More from Dr Glassman can be expected on this blog over the next few days.
Greta Thunberg could do with some Rocket Science 101 from the Good Doctor.

J. A. Glassman, “A Generalization of the Fast Fourier Transform,” IEEE Transactions on Computers, vol. 19, no. 2, pp. 105–116, 1970. IEEE Computer Society. ISSN 0018-9340. doi:10.1109/T-C.1970.222875.

http://www.wrongkindofgreen.org/2019/02/03/the-manufacturing-of-greta-thunberg-for-consent-the-house-is-on-fire-the-90-trillion-dollar-rescue/
https://en.wikipedia.org/wiki/Talk:Greta_Thunberg
Concerns of Corporate Greenwashing
Greta Thunberg responded [5] to concerns of corporate capture of her message which have been expressed by the Wrong Kind of Green [6], an Indigenous peoples’ environmental group. [7] “We attempt to expose those who undermine the People’s Agreement. One role of the non-profit industrial complex is to undermine, marginalize and make irrelevant, the People’s Agreement. The reason being, to protect corporate interests by which they are funded. As well, the non-profit industrial complex protects the industrialized, capitalist economic system, responsible for the capitalist destruction of our shared environment. Those groups who continue to protect such interests must be considered complicit in crimes against humanity.” — Preceding unsigned comment added by RogerGLewis (talk • contribs) 02:39, 11 February 2019 (UTC)
https://en.wikipedia.org/wiki/Talk:Extinction_Rebellion
http://www.wrongkindofgreen.org/2019/01/28/the-manufacturing-of-greta-thunberg-for-consent-the-most-inconvenient-truth-capitalism-is-in-danger-of-falling-apart/#comment-300785
https://clivelord.wordpress.com/2019/02/03/monbiot-attenborough-and-greta-thunberg/
https://en.wikipedia.org/wiki/Clive_Lord
https://www.ecowatch.com/greta-thunberg-climate-strike-2627956100.html
http://www.wrongkindofgreen.org/2019/02/03/the-manufacturing-of-greta-thunberg-for-consent-the-house-is-on-fire-the-90-trillion-dollar-rescue/
http://www.wrongkindofgreen.org/about-us/

I am addressing some editing questions on Wikipedia at the moment, as well as working on a video on the maths and finite-analysis tools developed back in 1970 by Dr Glassman.
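The 1970 paper cited above generalizes the fast Fourier transform beyond the familiar power-of-two case. As a rough point of reference only, here is a minimal Python sketch of the ordinary radix-2 recursion; it is not Dr Glassman’s general-N algorithm, just the special case his paper generalizes.

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + twiddle
        out[k + n // 2] = even[k] - twiddle
    return out

# Example: transform a short real signal and print its magnitude spectrum.
signal = [0.0, 1.0, 0.0, -1.0, 0.5, 0.0, -0.5, 0.0]
print([round(abs(v), 3) for v in fft(signal)])
```

For a power-of-two input this matches the direct definition of the DFT to within floating-point error; Glassman’s contribution was to extend the same fast factorization to arbitrary lengths.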

Science and Skepticism

By Jeff Glassman, PhD

In Audio, Rest Day/Theory

May 02, 2009

Audio Article

Jeff Glassman lectured about the nature of science itself at the Science of Exercise seminar on April 25th, 2009. Some have said that science is like love in that we can’t define it, but we know it when we see it. Jeff disagrees. There are certain criteria that differentiate science from non-science, and it’s essential to identify them.

Key terms are defined, such as models, measurement and prediction, conjecture, hypothesis, theory, and law. And, the nature of logic, analysis, knowledge, and error are described.

These terms are used to explain a variety of concepts, such as why gravity is science and creationism and peer review are not.

Jeff has made his Lecture Notes available.

58min 38sec

[Figure: Glassman diagram 2]


http://www.rocketscientistsjournal.com/

Jeff Glassman | March 3, 2011 at 1:27 pm |
Re Agreeing(?) 2/26/11
When an individual is assessing [Zeke’s statements], the epistemic level of the assessor is relevant. I propose the following levels:
1. Research scientist publishing papers on relevant topics
2. Individual with a graduate degree in a technical subject that has investigated the relevant topics in detail.
3. Individual spending a substantial amount of time reading popular books on the subject and hanging out in the climate blogosphere
4. Individual who gets their climate information from talk radio
Note: personally, I would rate an epistemic level of 1 on some of the topics, and level 2 on others.
The context is clear enough: assessor means a voter contributing his opinion to Zeke’s (and IPCC’s) poll of subjective likelihoods, quantified with equally subjective probability numbers. In that setting, and for some sense of purity in the art of opinion surveying, the assessor levels might be relevant. That poll could be scientific in the field of opinion sampling, but not in the sense of the substantive, climate elements in Zeke’s statements. The polling process would be equally valid for the proposition “multiple supreme beings exist” (to enlist the monotheist reader in the argument), and the supporting statistics of opinion.
Zeke and Curry agree that there is no serious challenge to Keeling’s measurements … . Does this agreeing mean that they are unaware of any Level 1 challenge in a peer-reviewed climate journal? Is the operative word serious? Are Z&C distinguishing between concentration measurements at MLO and the reconstructions accepted by IPCC and widely known as the Keeling Curve? AR4, Figure 2.3(a), p. 138. Do Keeling’s measurements include the isotopic measurements, and IPCC’s reductions to fingerprints? Id., Figure 2.3(b). Do they include the missing wind vector by which samples were deemed valid for the reconstruction? Have Z&C considered the fact that MLO lies in the exhaust plume of massive oceanic outgassing in the Eastern Equatorial Pacific, and that Keeling cautioned against relying on measurements near sources or sinks?
Z&C agree on Extremely Likely statement 5: The majority of the increase in carbon dioxide concentrations since pre-industrial times is due to anthropogenic emissions of carbon dioxide. This is confirmed both by the isotopic signature of the carbon and the fact that concentrations rise proportionate to emissions. Both signatures are the product of IPCC’s chartjunk. Hiding the Decline. Part IV: Beautiful Evidence, undersigned, 2/25/11 at 1:16 pm. Is this agreeing based on another Level 1 “accumulation of evidence” from peer-reviewed climate journals?
If the intent of the epistemic level applies beyond weighing opinions, then it might be relevant to the level of courtesy extended, and the depth presented, in answering the concerns of people – people who express an opinion or curiosity about the substance of the science of climate. However, the levels read like a recipe for formalizing ad hominem responses. If that is the intent, it is arrogant and elitist. That view would be underscored if publishing papers is to apply only to peer-reviewed journals.
To make matters worse, the statement of Level 4 could have been taken from a political left-wing tract. Clearly the left has adopted AGW as a plank in its platform, but the wording of Level 4 coming from the climate community tends to confirm what the right wing accuses: that climate science itself is a political faction of the left. If that is what is to be conveyed, it might be punched up by including Fox News Channel and Glenn Beck, its conspiracy theorist du jour.
Science is entirely about models of the real world with predictive power. Epistemologically, science is the objective branch of knowledge, and, except for certain honors and accolades, devoid of the subjective, of belief systems, of voting, and of academic and bureaucratic hoops, like publication and peer-review. The scientific method is a logical structure composed of definitions, logic, facts, cause-and-effect constructs, predictions, and validation. The prerequisites of publication and peer-review have academic and quality control implications, but are distractions in the scientific method. To the extent that a scientific model has, or fails to have, predictive power, whatever else might be said is trumped.
The epistemology in the matter of climate lies in the quality of supporting scientific models, usually graded in increasing order from conjecture, to hypothesis, theory, and law. The epistemic level should be the quality of an inquiry, whether posed as a question or proposed as an alternative, as it relates to science and the scientific method, independent of the character or credentials of its source. “Dr. Curry, why do you think CO2 is long-lived?” deserves a full dialog with even a Level 4 citizen.
To avoid wallowing about in philosophy on the one hand, or in the minutiae of the AGW model, the matter needs to be brought into focus on the monumental underlying question. The United Nations IPCC has rung the alarm. It claims that an out-of-control industry has launched an irreversible movement toward human catastrophe, a movement that can be only mitigated by what amounts to finishing off what remains of the World’s crippled economy, all in the name of climate and the hope of cutting back carbon dioxide emissions.
IPCC lays this matter expressly and squarely before the World’s “Policymakers”, impliedly and especially the United States Congress as representatives of the (disproportionately) largest contributor to both the investigators’ purse, and to the CO2 emissions. That Congress has just responded by advancing a statute to defund IPCC. Whether that survives to be law is a short-term matter, but the question will be “extremely likely”, i.e., “>95% probability”, a campaign issue before U.S. voters next year. If the climatology community wants to keep its seat on the gravy train, in case that train ever pulls out of the station, it had better satisfy those with Level 4 doubts.
The ultimate climate question can be morphed into various forms, but it is now public, out of the hands of climatologists, scientists, philosophers, academics and journal editors. It is a question for laymen, who best fit the politically charged Level 4. These laymen include policymakers, their staffs, political candidates, lawyers, preachers and their flocks, people who get their information from the Internet, from posters, from the streets, and “more likely than not”, “>50% probability”, the most populated category, the uninformed. About half of them are going to vote next year, and the result will be determined by a distorted measure of the mean opinion, dominated by Level 4 as generalized.
The scientific question put to the public is whether to give credence to IPCC’s alarm. So the problem has its origin in IPCC’s work. Early on, IPCC assumed that AGW exists. IPCC, Principles Governing IPCC Work, 10/1/1998, p. 2. Then it adjusted its models to agree on a prediction of just the right amount of global warming, too small to be invalidated before it is funded, but large enough to loosen public purse strings immediately. This warming must be from various, plausible CO2 emission scenarios, and not from natural causes, of course. IPCC’s core assumption that AGW exists is the first hole in its argument. It lends itself to a simple, binary hypothesis: the null hypothesis, H0, is that AGW does not exist, and the converse, H1, is that AGW exists.
The evidence for the affirmative, H1, is what can be inferred from IPCC Reports with their data and modeling results. Those Reports comprise a condensation of opinion from over 3,500 experts, plus several hundred editors and reviewers, and the content of over 6,000 peer-reviewed papers. Adding more experts, more peer-reviewed papers, or more GCM runs doesn’t amount to significant evidence for the affirmative hypothesis. That is evidence considered cumulative in the legal sense, disallowed in court, and deserving similar fate in the public climate question. When the accumulating evidence comprises measurements, however, then the model can acquire statistics for its predictions — real probabilities this time, not fantasy probabilities. Such facts may be cumulative in law, but they are scientifically significant. They provide probability distributions for predictions. Without probability bounds modeled on facts, most model predictions would be meaningless.
IPCC Reports are designed to show the consequences of the AGW assumed to exist, not to support the existence of AGW. However, IPCC does dedicate a chapter to its evidence against the null hypothesis. It says, The role of natural internal processes can be estimated by studying observed variations in climate and by running climate models without changing any of the external factors that affect climate. AR4, Chapter 9; FAQ 9.2 Can the Warming of the 20th Century be Explained by Natural Variability?, p. 702. Those models are born-again weather emulators, biased by design at their rebirth against natural causes and in favor of human causes of climate change. Some design choices are biased by misfortune, as in the choice of the radiative forcing paradigm. More important biases arise from not just a lack of diligence in modeling natural forces, but from discouragement to do so. The models cannot supply evidence for what they were designed to minimize. The models don’t work under H0; the null hypothesis requires different models.
Evidence that IPCC erred in its modeling is evidence to reject H1. As Oreskes’ famous study shows (e.g., Undeniable Global Warming, NY Times, 12/26/04), H0 evidence is not to be found in peer-reviewed climate journals. http://www.sciencemag.org/content/306/5702/1686.full . Therefore, it is excluded from peer-reviewed climate journals, and in large part excluded from Level 1. Nonetheless, it is both extensive and available.
Evidence supporting the null hypothesis is in hand. Stott, et al., 2003 (see thread Hiding the Decline. Part IV: Beautiful Evidence) discovered a fragmented fingerprint of solar activity in HadCM3 runs forced with amplified solar models and regression of the output against the instrumented temperature record. They discovered a statistically significant amplification factor in the climate of 2.78 using the best solar model then available. AR4 dismissed the paper on mistaken and irrelevant grounds, leaving its models still without an amplification factor. The fact that the GCMs do not respond to solar variations is not evidence that the climate is insensitive to them. The sensitivity to solar variations exists in the temperature record, and the GCMs need repair to account for the facts.
By relying on a GCM, Stott’s analysis omits the reaction of the ocean’s heat capacity, and the strong, positive and negative feedbacks of cloud albedo, two of the omissions in GCMs that also serve to negate the affirmative hypothesis. Stott, et al., investigated the sun/climate relationship by regression analysis on simultaneous records. That they still found mathematical evidence for the amplification factor at zero lag was especially good luck. Had they applied reasonable physical models for the integrating and lagging (low pass filtering) response of the ocean, and the positive feedback of cloud albedo from the burn off effect, they could have discovered that solar activity can account for the full, 140-year instrumented temperature record. Using the updated, IPCC-preferred, solar model by Wang, et al., 2005, Earth’s surface temperature follows the Sun with an accuracy of 0.11ºC (1σ), a value comparable to the variability in IPCC’s smoothed temperature reduction for the period (rocketscientistsjournal.com, 4/17/10).
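To make the point about the ocean’s integrating, lagging response concrete, here is a small synthetic sketch in Python. It is my own illustration, not Stott’s or Dr Glassman’s analysis; every number is invented except the 2.78 factor quoted above. A temperature-like series built as a first-order lagged response to a solar-like forcing regresses to a much smaller slope against the raw forcing than against the lagged version.

```python
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(1400)                                   # ~140 years of monthly values
solar = np.sin(2 * np.pi * months / 132)                   # an 11-year-cycle-like forcing
solar += 0.1 * rng.standard_normal(months.size)

# Ocean response modelled as a first-order lag (integrating, low pass), tau in months.
tau = 60.0
alpha = 1.0 / (tau + 1.0)
ocean = np.zeros_like(solar)
for i in range(1, months.size):
    ocean[i] = ocean[i - 1] + alpha * (solar[i] - ocean[i - 1])

amplification = 2.78                                       # the factor quoted from Stott et al.
temp = amplification * ocean + 0.2 * rng.standard_normal(months.size)

# Zero-lag regression against the raw forcing vs the lagged (ocean-filtered) forcing.
slope_raw = np.polyfit(solar, temp, 1)[0]
slope_lagged = np.polyfit(ocean, temp, 1)[0]
print(f"slope vs raw forcing:            {slope_raw:.2f}")
print(f"slope vs ocean-filtered forcing: {slope_lagged:.2f}  (true factor {amplification})")
```

The raw-forcing slope comes out far below the true factor because the lagged, smoothed response is only weakly correlated with the instantaneous forcing; that is the sense in which a zero-lag regression understates the solar amplification.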
IPCC’s conclusion that Anthropogenic warming of the climate system can be detected in temperature observations taken at the surface, in the troposphere and in the oceans (AR4, Ch. 9 Executive Summary, p. 665) is contradicted in the first order. The temperature lags the Sun, and whatever patterns might have been deemed significant to be attributed to humans cannot exist on the Sun. Independent evidence shows that the attribution to humans of the large signal, 1ºC rise in Earth’s global average surface temperature over the last century is erroneous, and confirms the non-existence of AGW.
• Terry Oldberg | March 4, 2011 at 1:12 am |
To grade individuals by their “epistemic levels” seems to me to be a distraction from the issue that is of importance to the people of the world. The epistemic failure of individuals of epistemic level 1 is evident in the failure of these individuals to identify the set of observed independent statistical events that would prove false the claims of these individuals.
◦ Jeff Glassman | March 4, 2011 at 1:30 pm |
Terry Oldberg 3/4/11 1:12 am
Epistemic level grading is a multi-pronged distraction.
It is nominated to be an official distraction for certified climatologists who may apply it to safely ignore external criticism. It’s a softer version of CRU ridiculing the quality of non-conforming journals, or conspiring to shun a journal that published a heretical article and suggesting the journal be stripped of its peer-reviewed status.
The levels are intended to distract the policymakers who fund climatologists and their labs, and who are supposed to restrict carbon emissions. The distraction is from those deemed not qualified to object, reducing science to selected expert testimony. Only the Preacher may interpret the Word — the others are heretics or the unwashed indigenous for the harvest of cultural imperialism.
The four levels are also a distraction from the fact that climatologists do not actually sit atop the tree of epistemology. In the epistemological hierarchy, climatology is subservient to the laws of physics, to thermodynamics, to system science, to science in general, and to ethics.
Putting on a Level 0 hat, here are some particularly ripe examples:
Laws of physics: IPCC disregards the Beer-Lambert Law, which governs radiative transfer. It disregards Henry’s Law of solubility, governing CO2 flux, to make natural CO2 more soluble than the anthropogenic variety.
Thermodynamics: IPCC models the surface layer of the ocean in thermodynamic equilibrium, essential to the manufacture of its false bottleneck for human CO2 emissions.
System science: IPCC thinks feedback, which it adapted from system science, is either an illusory relation between correlated processes or a computer parameter calculated at run time. Feedback actually is the transfer of material, energy, displacement, potential, or information from within a system that modifies the system’s inputs. As a result, IPCC models climate open-loop, as, for example, with cloud albedo, and is unable to calculate any closed-loop gain (see the sketch after this list).
Science: IPCC models climate without accounting for the known, dominant climate events of the past (e.g., the ice ages, the interglacial epochs, the Medieval Warm Period, the Little Ice Age, sudden changes in the slope of the temperature instrument record), and predicting only an unverifiable catastrophe. By inference, its models predict a near term trend in temperature, which has failed validation. IPCC models, which don’t even pretend to work in the long run of past epochs, nor in the short run of weather, are advertised to predict a looming, mid-term catastrophe. It’s like accounting with the least significant and the most significant figures obliterated, leaving just the middle, insignificant figures.
Duty & Ethics: IPCC violates its public trust by employing chartjunk to create faux fingerprints of human activity on atmospheric CO2. IPCC in its quest for recognition, control, and profit relies on an unvalidated model, at best a hypothesis, to foster public panic.
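On the system-science point above, the standard closed-loop relation is simple to state. A tiny Python sketch with purely illustrative numbers (none of these are IPCC values):

```python
def closed_loop_gain(g_forward, f_feedback):
    """Closed-loop gain for forward gain g and feedback fraction f fed back to the input.

    Positive f amplifies, negative f damps; the expression diverges as g * f approaches 1.
    """
    return g_forward / (1.0 - g_forward * f_feedback)

g0 = 1.2  # illustrative forward (open-loop) gain only
for f in (-0.5, 0.0, 0.3, 0.6):
    print(f"feedback fraction {f:+.1f}: closed-loop gain {closed_loop_gain(g0, f):.2f}")
```

The open-loop case (f = 0) simply returns the forward gain; with feedback the result can be markedly larger or smaller, which is the behaviour an open-loop treatment cannot exhibit.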
◦ Brian H | March 6, 2011 at 11:52 pm |
Re: Jeff Glassman,
Excellent summary and indictment. Thanks.

◦ Jeff Glassman | March 27, 2011 at 9:51 pm |
Pete Ridley, Agreeing(?), 3/26/11 2:41 pm
1. Re splicing ice core data onto modern measurements: Assume for the moment that the ice core data are properly calibrated so that the CO2 concentration is correct for the time and place of firn closure. However, that process low pass filters the records mechanically in the range of many decades to many centuries. Instantaneous CO2 concentrations are irreversibly lost in the ice core record. By contrast, the MLO CO2 concentration aperture time is approximately one minute, and essentially instantaneous. Even calibrating the ice core data to match the instrument record, the slopes should not match. To compare the two, one might imagine low pass filtering the MLO CO2 data with a comparable lag (a smoothing sketch follows this list). The ice core data are insensitive to an epoch of 50 to 100 years, such as that observed in the full MLO record. A computer algorithm is necessary to blend the ice core and instrument records into smooth records.
2. The Hockey Stick advanced by Mann et al. (1998) and revised by the authors in 1999 was, as you note, the featured hallmark of IPCC’s TAR. The Stick erased the Medieval Warm Period and the Little Ice Age. The authors then spliced their overly-smoothed tree-ring temperature reconstruction smoothly into the modern thermometer record. This had the effect of making the temperature record (a) benign before the industrial era and rising with man’s CO2 emissions thereafter, and (b) in the modern era, unprecedented at least for the past millennium. Thus Mann and IPCC produced evidence that (a) man caused the observed temperature rise and (b) that the rate of rise is at an alarming level.
3. IPCC uses the coincidence of the industrial era and the rising temperature as evidence that CO2 causes warming. It uses the fact that the modern records are unprecedented compared to reductions over the last millennium to bolster its conclusion that man is the cause of consequential global warming. These are naïve, unscientific principles. They are corollaries of the clichéd caution that correlation does not establish cause and effect. They do not validate AGW.
4. You can add SO2 to your list of gases IPCC hockey sticked.
5. Following the TAR in 2003, first Soon with Baliunas and then McIntyre with McKitrick published papers critical of the hockey stick reduction. Soon, relying on a variety of regional and local climate indicators, concluded somewhat ambiguously that the MWP and the LIA must have been real, global epochs. McIntyre, correcting several errors discovered in Mann’s analysis, showed that Mann’s Northern Hemisphere method not only produced temperatures greater than the modern era, but did so during the depths of the Little Ice Age. McIntyre’s results are not only critical of Mann’s statistical methods, but call tree-ring reductions into question. That same year, Mann and Jones published a paper sustaining their previous results, and adding somewhat similar results for the Southern Hemisphere and for global temperatures. The major difference in the new reductions was that the dendroclimatology results no longer smoothly blend into the instrument record.
6. The analyses by Soon and McIntyre suggest that Mann and IPCC manufactured evidence for their claims.
7. IPCC briefly reported the controversy in AR4, but did not withdraw Mann’s hockey stick reduction. It retained the hockey stick reduction, but obscured it by burying it in a spaghetti graph of a dozen or so contemporary reductions. In addition, IPCC provided new definitions for the MWP and the LIA. In the First Assessment Report of 1990, the temperature anomaly circa 1975 was about 0.09ºC (interpreted from subsequent IPCC’s charts), the MWP contained the maximum of about 0.47ºC, and the LIA contained the minimum of -0.61ºC, each global over the period of 900 to 1975. IPCC now defined the MWP simply as being warmer than the LIA, and vice versa, and applicable just to the Northern Hemisphere. By definition now, Mann et al.’s analysis was appropriately regional, and no longer deleted these two epochs.
8. Prof. Alley’s AGU talk was IPCC Animated and Prognosticated. He denied that the best model for Total Solar Radiation could predict the Global Average Surface Temperature, when that has been demonstrated with an accuracy comparable to the variability of the smoothed temperature record itself. He ignored the solubility of CO2 in water, and that at Vostok, the CO2 record accurately followed the temperature record according to Henry’s Law. He never mentioned cloud albedo, the most powerful feedback in climate. He ignored that it is a negative feedback to climate sensitivity, for which he gave IPCC’s open loop values. He also dismissed cloud albedo as a positive feedback to solar variations, which IPCC also evaluated open loop. He dismissed that positive feedback in reliance on Svensmark’s galactic cosmic ray conjecture, never recognizing the cloud burn-off effect, nor published correlations between GAST and GCRs and between GAST and TSI. He never mentioned water vapor, the dominant GHG, and that its effect is strongly temperature dependent according to the Clausius-Clapeyron equation. To his credit, he alluded to post AR4 evidence for acidification by atmospheric CO2. However, he never mentioned that the model for CO2 acidification, based on the quantitative Bjerrum theory, requires a priori that the surface layer of the ocean absurdly be in thermodynamic equilibrium.
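The smoothing sketch referred to in point 1 above: a few lines of Python (mine, with invented but representative numbers) showing how a roughly 50-year CO2 bulge survives in annual data but nearly disappears once a centuries-wide firn-closure window has averaged over it.

```python
import numpy as np

years = np.arange(2000)                                        # 2,000 years of annual values
co2 = np.full(years.size, 280.0)                               # flat 280 ppm baseline
co2[1000:1050] += 80.0 * np.sin(np.pi * np.arange(50) / 50)    # a ~50-year, 80 ppm bulge

window = 600                                  # assumed firn-closure smoothing span, in years
kernel = np.ones(window) / window             # uniform ("boxcar") smoothing function
smoothed = np.convolve(co2, kernel, mode="same")   # edge values are padding artifacts only

print(f"peak of the bulge in the annual record: {co2.max():.0f} ppm")
print(f"peak after {window}-year smoothing:      {smoothed.max():.0f} ppm")
```

The particular window width is an assumption; the point is only that a record smoothed this heavily cannot be expected to show, or to be spliced smoothly onto, an event as narrow as the MLO rise.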


◦ Jeff Glassman | March 29, 2011 at 10:05 pm |
Pete Ridley, Agreeing(?), 3/28/11 2:41 pm
Hey, Pete!
By “that process” I meant the model by which a gas age can be estimated as a function of core depth for any gas. Values for that parameter are sufficient to estimate the time span for the gas released from a sample, given the length of the sample. The trapped gas concentration should be proportional to the integral product of the prevailing gas concentration over at least the time span of the sample with a sampling function. That is the low pass filter.
Investigators sometimes write about this process as if the sampling function were uniform over the time span. Whether that is true or not, uniform ought to be a good, first order approximation, and climate modeling is still searching for a valid first order model. A uniform sampling function ought to produce a first order low pass filter.
If the sampling function is approximately bounded, the width is the lower bound for uncorrelated sampling.
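Written out, the relation described in the last few paragraphs is a convolution of the prevailing atmospheric concentration with a smoothing kernel. Assuming the uniform (boxcar) kernel of width T suggested above:

\[
c_{\mathrm{core}}(t) \;\propto\; \int c_{\mathrm{atm}}(\tau)\, w(t-\tau)\, d\tau ,
\qquad
w(u) =
\begin{cases}
1/T, & |u| \le T/2,\\
0, & \text{otherwise},
\end{cases}
\]

where T is the closure time span; any variation in the atmospheric concentration much shorter than T is averaged away.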
Following your reference to Severinghaus, I found the 2008 paper by Kobashi, Severinghaus, and Kawamura, Argon and nitrogen isotopes of trapped air, etc. Hasn’t Severinghaus gone much further with the model than you give him credit for? He and the others discuss several mechanisms for gas mobility, including thermal fractionation, and both lateral and vertical modes. They speak of gas leaking during close-off, losses due to fracturing, and losses during coring and storage, and, presumably, analysis.
You use the phrase “complete close-off”, (“sealing”, too, in Kobashi, et al.), but in light of the above, does that ever exist? Your argument, too, seems to be that it does not. Along that line of inquiry, I have a problem with the simple notion of a fixed number, like 0.39 nm, for porosity. I would feel more comfortable with that number stated as the mean.
Doesn’t your concern about molecule size reflect on the calibration of the ice core results? It would affect the proportionality factor used in my integral, above. I presume that your work would in some sense improve the curve of ice age as a function of depth, as well as the conversion from sample concentration to atmospheric average.
I don’t follow your argument that these diffusion and fractionation considerations show that ice cores tend to produce hockey sticks. For the one truly important curve, CO2, the Vostok reductions have lots of character, which is significant compared to the temperature reduction. However, an improved calibration isn’t going to change the first order conclusion that ice core CO2, like other gas analyses, is heavily low pass filtered, and should not blend smoothly into the instrument record, even as it has been subjected to heavy-handed processing to produce the Keeling Curve.
The SIM photos are great.
Re sulphate, IPCC co-plots SO2 and SO4^2- in TAR, Technical Summary, Figure 8, p. 36. It plots SO4^2- alone in AR4, Figure 6.15, p. 480. In each case, IPCC blends, pastes, or co-plots ice core reductions to make heavily low-pass filtered reconstructions match smoothly into nearly instantaneous instrument records. IPCC did not intend its Reports to be complete or scientific, and they succeed in both regards with these hockey stick representations.
You asked why these IPCC representations are suspect. Here are three reasons.
(1) Low pass filtered data will not match unfiltered data. Low pass filtering occurs in ice cores because of the gas dependent aperture effect below the firn. The molecule size, porosity, and temperature profile are parameters that affect the depth at which the aperture effect occurs, but not the ultimate existence of the aperture, given sufficient pressure. That the calibration of ice age with depth might be improved is neither doubted nor relevant in the first order. The fool’s gold can be taken at face value.
(2) Data from different methods, as in proxies and instruments, should be portrayed honestly and quantitatively. Calibrating, smoothing, blending, and graphing to create visual correlation, a subjective illusion, is dishonest, and in dealing with the public, unethical. Any reduced, blended composite representation should be shown separately, clearly as the investigator’s subjective interpretation, and best as an overlay to data close to the raw state.
(3) IPCC’s hockey stick message is that man is affecting climate. It relies on the naïve and quite unscientific principle that correlation establishes cause and effect, and its corollary, the unprecedented principle. That man developed instruments to measure climate early in the industrial era is not a coincidence.
Climatology needs to be candled not by peer review, but by its higher authority, science.
◦ Jeff Glassman | March 30, 2011 at 10:17 am |
Pete Ridley, Agreeing(?), 3/28/11 2:41 pm
My description of “that process” is misleading. I have the right number of functions but in the wrong logical order, and I gave the kernel in the integral the unfortunate name of a sampling function. It is better described as a smoothing function.
Consider the actual CO2 concentration at Vostok over its time span. Then if we had a decompressed ice core, the CO2 concentration below the firn would be a version of the actual concentration, filtered by the smoothing function. It might be uniform to a first approximation. It might have rounded ends because the closure is a gradual decrease in porosity relative to the molecule size. Regardless, the concentration along the decompressed ice core would be the actual concentration smoothed by that function. Now recompress the ice core, cut it into samples, and measure the gas concentration.
The difference is that my first description in the post above can be falsely read as attributing the low pass filtering to the fact that the sample has a non-zero length. The low pass filtering is in the concentration recorded in the core, and even if the core could be read with samples of infinitesimal width, the low pass filtering would be present.
This is difficult to describe for lack of references that give the actual sample widths and the actual gas age algorithm corresponding to the published Vostok data. In my model, the ice at any depth contains CO2 both older and younger than the ice age. This seems to be taken into account by the gas age, although the gas age also seems to be dated as if the smoothing were one-sided, looking back in time relative to the ice age.
“That process” is equivalent to sampling the actual data with an unrealizable low pass smoothing filter, that is, one that looks ahead in time as well as back in time because of the mechanics of firn closure. The fact that the samples have width introduces an averaging effect in addition to that from firn closure.

• Brian H | April 2, 2011 at 6:08 am |
Jeff;
Your repetition of the “low pass filtering” phrase is a bit obscure for us non-rocket-scientists. I take it that you mean, in effect, smearing of the CO2 types and quantities above and below the sampled level. Carried to extreme, this suggests there would be one low peak somewhere in the record, and a shallow straight slope to the low point.
So the Vostok (or any ice core) record isn’t jagged enough to give detailed information, and suppresses all phenomena narrower than the average smearing width.
Is that close enough?
◦ Jeff Glassman | April 2, 2011 at 12:37 pm |
Brian H, 4/2/11, 6:08 am, Agreeing?
But not climatologists, pro or amateur. Low pass filtering is just one more bit of science adopted into climatology. (In this regard, do see my response to Fred Moolten, 3/28/11, 7:55 pm, on An essay on the current state etc.) As to your question, IPCC has a one page tutorial on Low-Pass Filters and Linear Trends, AR4, Appendix 3.A, that should provide the answer. Here’s a postage stamp size tutorial.
In short, a low pass filter allows the constant part of a signal to go through while rejecting high frequency components.
In engineering terms, you’ll hear the constant part called the DC, standing for direct current. The high frequency components are often, and regularly in climatology, the parts identified as variability. The meaning of high depends on the application and the choice of the investigator. The low pass filter can be extremely simple, or quite complex. These are called low order and high order, respectively, and the lowest order are often called first order.
Low pass filtering causes a time delay and an attenuation. The time delay is a transient effect to the output, and it causes a ramp at the input to lag in time. The attenuation depends on the frequency of the input signal variations, which is the essence of the filter: the higher the frequency of a variable component, the greater the attenuation. The primary and corresponding characteristics of a low pass filter are its time constant and its variance (not variability) reduction ratio.
Averaging, smoothing and trend lines are all examples of low pass filtering.
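A minimal first-order low pass filter in Python (my illustration; the time constant and frequencies are arbitrary), showing the behaviour just described: the slow component passes with only modest attenuation while the fast component is strongly attenuated and lagged.

```python
import numpy as np

def first_order_lowpass(x, tau, dt=1.0):
    """Discrete first-order low pass filter with time constant tau (same units as dt)."""
    alpha = dt / (tau + dt)
    y = np.zeros_like(x, dtype=float)
    y[0] = x[0]
    for i in range(1, x.size):
        y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
    return y

t = np.arange(1000.0)
slow = np.sin(2 * np.pi * t / 500)          # low-frequency component
fast = np.sin(2 * np.pi * t / 10)           # high-frequency component
out = first_order_lowpass(slow + fast, tau=50.0)

# By linearity, the filtered fast component is the difference of the two filtered signals.
residual_fast = out - first_order_lowpass(slow, tau=50.0)
print(f"high-frequency amplitude in:  {fast.std():.2f}")
print(f"high-frequency amplitude out: {residual_fast.std():.2f}")
```

With tau = 50, the period-10 component comes through at only a few percent of its input amplitude, while the period-500 component loses only a modest fraction.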
All detectors are low pass filters because they must have a time aperture over which the signal is collected. Even instantaneous thermometers are not instantaneous. Modern CO2 analysis is a low pass filter with a time constant equal to the time required to collect the sample, which is as much as a minute or two in the manual mode. The variance reduction ratio is inversely proportional to the square root of time aperture.
Ice cores collect air samples over a protracted interval, ranging from a few decades to a few millennia. The variance reduction ratio compared to MLO is on the order of 23,000:1, the square root of the number of minutes in a millennium. The composition of the air in each sample is proportional to the average in the air over that closure interval. All quantities in the air are low pass filtered. The characteristics of the low pass filter are gas dependent, because firn closure depends on molecule size as the core goes through compression, decreasing porosity.
My summary of the Vostok low pass problem is two-fold. One, it can barely respond to an event like the rise in CO2 seen at MLO. The 50 year or so MLO bulge in CO2 would be just a bit of variability at Vostok. Two, the sampling interval at Vostok is about 1,300 years, so the chance of even hitting an MLO-scale event is only about 3%. When IPCC says that the modern CO2 concentration is unprecedented over the last 620,000 years, the confidence interval is about 3%. The observation is ludicrous.
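A quick check of the two order-of-magnitude figures quoted above (assuming a 50-year event and 1,300-year sample spacing, as stated):

```python
import math

minutes_per_millennium = 1000 * 365.25 * 24 * 60
print(f"sqrt(minutes in a millennium): {math.sqrt(minutes_per_millennium):,.0f}")  # ~22,900

event_width_years = 50        # width of the MLO-scale CO2 bulge
sample_spacing_years = 1300   # approximate spacing of Vostok gas samples
print(f"chance a sample lands on the event: {event_width_years / sample_spacing_years:.1%}")
```

Both come out consistent with the numbers in the comment: a variance reduction ratio on the order of 23,000:1, and a hit probability of a few percent.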


Climate Etc.

by Judith Curry

On Lucia Liljegren’s Blackboard (commonly categorized as a “lukewarmer” site), Zeke has a post titled “Agreeing.” Zeke’s motivation for this is:

My personal pet peeve in the climate debate is how much time is wasted on arguments that are largely spurious, while more substantive and interesting subjects receive short shrift. While I’m sure a number of folks will disagree with me on what is spurious vs. substantive, I think it would be useful to outline which parts of the debate I feel are relatively certain, are somewhat uncertain, and quite uncertain.

