Bhs1975 Posted April 6
Related to the CO2 fertilization post above, the recent study below accounted for the albedo effect of planting trees. In many places around the world, planting trees causes warming, as increases in absorbed sunlight offset the benefit from CO2 removal. https://www.nature.com/articles/s41467-024-46577-1
So we would have to remove it using renewables and sequester it underground by using reactions to form limestone.
chubbs Posted April 6
50 minutes ago, Bhs1975 said: So we would have to remove it using renewables and sequester it underground by using reactions to form limestone.
Don't get me wrong, there are many reasons to plant trees, and trees produce a net cooling in many parts of the world. But once it is in the atmosphere, removing CO2 will be costly and slow. Using renewables/batteries/electrification to not emit CO2 in the first place is by far the cheapest strategy, particularly with costs continuing to drop.
ChescoWx Posted April 6
12 hours ago, chubbs said: This recent study shows that global heat waves are becoming more frequent, lasting longer, covering larger areas, moving slower, and bringing higher temperatures. https://www.science.org/doi/10.1126/sciadv.adl1598 Compounding the effects of high temperature, global dew point and wet bulb temperatures are also rising.
Of course, climate did not begin in 1979... perhaps you overlooked that one...
ChescoWx Posted April 6
The chart below, with a longer and more appropriate time frame than Charlie showed above, shows that the actual frequency of daily high temperature records in the U.S. [avg across ~540 GHCN-daily record stations] has decreased since 1911. The mean over the last five decades is almost 21% lower than the 1911-1960 average. Cold extremes have decreased too, and rather sharply since the 1980s. Again, can we say cyclical normal climate change and nothing at all to be alarmed at here??
ChescoWx Posted April 6
12 hours ago, chubbs said: Chesco didn't show us Figure 1 and 2, must have been an oversight.
Hey Charlie... there is actually weather data before 1960... some folks don't like that data - well, unless we make some chilling adjustments, as it does not support the non-stop warming story. The charts I added above do not omit the older data from our last warming cycle, which you likely forgot to add...
ChescoWx Posted April 6
We are told by alarmists that warming of the climate is expected to increase the frequency and intensity of heavy rainfall events. One key measure used by climate models is the annual maximum 1-day and 5-day precipitation amounts (Rx1day and Rx5day, respectively). The results below indicate that both Rx1day and Rx5day have increased across the US since 1960... but of course, if we choose to go back far enough, rainfall extremes were higher between 1901 and 1930 than in recent decades across both measures. Funny how weather history, if we go back far enough, shows us that all that is happening now has indeed occurred before. No climate emergency at all to see here!
chubbs Posted April 6
31 minutes ago, ChescoWx said: hey Charlie.....there is actually weather data before 1960.....some folks don't like that data - well unless we make some chilling adjustments as it does not support the non-stop warming story. The above charts I added do not omit the older data back in our last warming cycle that you likely forgot to add...
I have no problem going back before 1960. I'll go way back, but I'm not going to limit myself to a small slice of biased data. No, I want to look at all the data and use the best methods to analyze it. From the last IPCC report:
chubbs Posted April 6
26 minutes ago, ChescoWx said: We are told by alarmists that warming of the climate is expected to increase the frequency and intensity of heavy rainfall events. One key measure of this used by climate models is the annual maximum 1 and 5-day maximum precipitation amounts (Rx1day and Rx5day, respectively). The results below indicate that while both the Rx1day and Rx5day have increased across the US since 1960....of course if we choose to go back far enough, rainfall extremes were higher than recent decades between 1901 and 1930 across both measures. Funny how weather history if we go back far enough....shows us all that is happening now has indeed occurred before. No climate emergency at all to see here!
I don't know, Paul, I get a different picture after checking the reference. For one thing, I couldn't find the chart you posted; it must be from the supporting material. As for the body of the paper, it showed clear evidence for climate change in temperature and heavy precipitation. Below are a text snippet, a couple of charts, and the paper link. This leaves me wondering where you get your information. https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2019JD032263
ChescoWx Posted April 6
1 hour ago, chubbs said: I have no problem going back before 1960. I'll go way back, but I'm not going to limit myself to a small slice of biased data. No. I want to look at all the data and use the best methods to analyze it. From the last IPCC report
Let's go back even further, Charlie! The Earth is essentially at its coolest point in the last 65 million years, but climate alarmists love to tell us the planet is “too hot” because temperatures are 1.2°C warmer than they were in 1850, the tail end of the Little Ice Age!!
Typhoon Tip Posted April 6
Need more research publications such as this... https://phys.org/news/2024-04-electric-vehicles-lowering-bay-area.html ...to help dispel these Big Oil propaganda narratives that it costs too much, or that it just shifts the problem elsewhere - all of which are of course by design, at great convenience to their economic outlook: "So we may as well just not do anything."
GaWx Posted April 6
9 hours ago, chubbs said: The article linked was more sanguine about CO2 effects than your write up. Not all of the greening is due to CO2 and increased greening is a mixed blessing. Your description of radiation effects isn't correct. Increased photosynthesis causes plants to absorb more sunlight, and reflect less, so greening generally causes warming. The effect is particularly large in the arctic where greening is mainly due to expansion of shrubs and trees northward. The greener surface absorbs much more sunlight than the snow or tundra surface it replaces. I believe these effects are included in models but am not familiar with the details. Finally here's a short interview with an ag expert, who expects a negative impact from CO2 on agriculture in most areas. https://www.theclimatebrink.com/p/is-co2-plant-food
Charlie,
1) This chart was posted by Chris in his thread on the Midwest warming hole: note the reduced warming, or even slight cooling, in much of the MW during summer vs. most other areas. This is despite large increases in crop sizes.
2) From Midwest pro meteorologist Mike Maguire: “On the albedo of global greening absorbing more sunshine and warming the planet. I can debunk that myth quickly. We know that the MOST warming is taking place in the coldest places and at the coldest times (higher latitudes during the Winter and at night). Those are also the times when albedo from the sun has the LEAST impact.” “The nights (with no sun) have been warming the most, days the least. Also, the driest locations with the lowest humidity are warming the most. This is because of the radiation physics of CO2 and H2O and not albedo. In areas with the highest water vapor content, water vapor crowds out much of the CO2 absorption from the same bands of absorption. Some of the radiation absorption bands are already saturated from H2O in areas with very high dew points. In drier areas........which includes ALL cold places, CO2 is able to absorb more long wave radiation because of the absence of H2O absorbing at the same bands. There's no disputing this proven law of radiation physics!”
Bhs1975 Posted April 6
ChescoWx said: Let's go back even further Charlie! The Earth is essentially at its coolest point in the last 65 million years, but climate alarmist love to tell us the planet is “too hot” because temperatures are 1.2°C warmer than they were in 1850, the tail end of the Little Ice Age!!
Humans, and most plants and animals living today, couldn't survive in those conditions. Only cold-blooded reptiles, and dinosaurs that were semi-cold-blooded, would be able to keep from overheating. If we get to 800 ppm CO2, that's where we would end up.
gallopinggertie Posted April 7
8 hours ago, ChescoWx said: Let's go back even further Charlie! The Earth is essentially at its coolest point in the last 65 million years, but climate alarmist love to tell us the planet is “too hot” because temperatures are 1.2°C warmer than they were in 1850, the tail end of the Little Ice Age!!
Okay, but human agriculture has only been around during a very narrow range of global average temperature, one much cooler than the vast majority of those 65 million years, without reaching the coldest extremes of the current Ice Age. I don't think we want to see how our civilization would do at Eocene-like temperatures. And on top of that, it's the rapidity of the change that's so troubling. It would be much easier to cope with a ten-foot sea level rise over 2,000 years than over 200 years. Not making any specific predictions here, just giving an example. Humans are adaptable, but we have our limits.
chubbs Posted April 7
15 hours ago, GaWx said: Charlie, 1) This chart was posted by Chris in his thread on the Midwest warming hole: note the reduced warming or even slight cooling in much of the MW during summer vs most other areas. This is despite large increases in crop sizes. 2) From Midwest pro meteorologist Mike Maguire: “On the albedo of global greening absorbing more sunshine and warming the planet. I can debunk that myth quickly. We know that the MOST warming is taking place in the coldest places and at the coldest times (higher latitudes during the Winter and at night). Those are also the times when albedo from the sun has the LEAST impact.” “The nights (with no sun) have been warming the most, days the least. Also, the driest locations with the lowest humidity are warming the most. This is because of the radiation physics of CO2 and H2O and not albedo. In areas with the highest water vapor content, water vapor crowds out much of the CO2 absorption from the same bands of absorption. Some of the radiation absorption bands are already saturated from H2O in areas with very high dew points. In drier areas........which includes ALL cold places, CO2 is able to absorb more long wave radiation because of the absence of H2O absorbing at the same bands. There's no disputing this proven law of radiation physics!”
I agree that albedo effects from fertilization are not as important as CO2 radiation effects or plant evapotranspiration, which is important in the case of midwest corn. On the other hand, I don't think Maguire has debunked albedo effects either. CO2 fertilization has a relatively small climate effect that needs to be evaluated carefully to determine whether it is positive or negative. I found a paper (link below) which isolated the biophysical effects of CO2 fertilization on climate: CO2 fertilization had a net warming effect, mainly due to the albedo effect of the northward advance of boreal forests. Considered separately in the paper, increased CO2 sequestration would offset the albedo effect in the short term but not in the long term. In any case, the climate effects of CO2 fertilization are small, and the radiation effects of CO2 dominate. Note also that increasing CO2 increases atmospheric water vapor, leading to a radiation effect that is in the same ballpark as the effect of the increased CO2 alone. https://www.tandfonline.com/doi/pdf/10.1111/j.1600-0889.2006.00210.x
Cobalt Posted April 7
On 4/6/2024 at 12:30 PM, ChescoWx said: Let's go back even further Charlie! The Earth is essentially at its coolest point in the last 65 million years, but climate alarmist love to tell us the planet is “too hot” because temperatures are 1.2°C warmer than they were in 1850, the tail end of the Little Ice Age!!
It seems like that temperature dip was necessary for human life to come around and flourish in the first place. What's true in the present day is that humanity can't support itself in an ice-free world: 2 billion people rely on glaciers for drinking water, power, and agriculture.
Bhs1975 Posted April 7
Cobalt said: It seems like that temperature dip was necessary for human life to come around and flourish in the first place. What's true in the present day is that humanity can't support itself in an ice-free world. 2 billion people rely on glaciers for drinking water, power, and agriculture.
Yep, population will definitely drop.
chubbs Posted April 8
Funny, Chesco is helping make the "alarmist" case. Googling indicates we have the same CO2 concentrations today as 14 million years ago. Only the ocean and cryosphere, which are slow to adjust to higher CO2, are keeping us close to our old climate. The good news is that the ocean will take up CO2 if we get emissions under control. We aren't committed yet to going back 14 million years; it's up to us to decide how far back in time we want to go. One thing is certain, though: ignoring the problem is going to make the future more alarming, not less. https://news.climate.columbia.edu/2023/12/07/a-new-66-million-year-history-of-carbon-dioxide-offers-little-comfort-for-today/ https://www.science.org/doi/10.1126/science.adi5177
Typhoon Tip Posted April 8
2 hours ago, chubbs said: Funny, Chesco is helping make the "alarmist" case. Googling indicates we have the same CO2 concentrations today as 14 million years ago. Only the ocean and cryosphere, which are slow to adjust to higher CO2, are keeping us close to our old climate. The good news is that the ocean will take up CO2 if we get emissions under control. We aren't committed yet to going back 14 million years. Its up to us to decide how far back in time we want to go. One thing is certain though. Ignoring the problem is going to make the future more alarming, not less. https://news.climate.columbia.edu/2023/12/07/a-new-66-million-year-history-of-carbon-dioxide-offers-little-comfort-for-today/ https://www.science.org/doi/10.1126/science.adi5177
The missing mathematics in everyone's contribution to the present stream of dialogue is the deltas. The d(CO2) since the Industrial Revolution has exceeded the positive rates of change seen over paleo-antiquity - many, many millions of years. It is far more likely that the total manifold of Earth systems that becomes "climate" has not fully responded... which makes events like last spring's global sea and air temperature spike quite disconcerting.
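[Editor's note] For scale on the forcing side of those deltas, the radiative forcing implied by a CO2 change is often sketched with the widely used simplified expression ΔF = 5.35 ln(C/C0) W/m² (Myhre et al. 1998). A minimal sketch, assuming a 280 ppm preindustrial baseline and roughly 420 ppm today:

```python
import math

def co2_forcing(c_ppm: float, c0_ppm: float = 280.0) -> float:
    """Simplified CO2 radiative forcing (W/m^2), per Myhre et al. (1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# ~420 ppm today vs. the 280 ppm preindustrial baseline:
print(round(co2_forcing(420.0), 2))  # about 2.17 W/m^2
```

The logarithmic form is why each successive doubling of CO2 adds a roughly constant increment of forcing rather than a proportional one.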
csnavywx Posted April 8
The sun was 0.6-0.7% dimmer 65 million years ago. You don't need as much CO2 for a given temperature this time around. Antarctica glaciated around 650 ppm, but my guess is that due to solar luminosity increases, you'd only need around 550-600 ppm to deglaciate it this time around. Humans do not appreciate just how late in the game we showed up and how lucky we are for CO2 to be as (relatively) low as it is now. Another 200-300 million years and this planet is going to be a permanent hothouse.
Edit: Corrected "brighter" to "dimmer" as intended.
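[Editor's note] The ~0.6% figure is consistent with the standard stellar-evolution scaling for solar luminosity, L(t)/L0 = 1 / [1 + (2/5)(1 − t/t0)] (Gough 1981). A quick sketch, assuming a present solar age of about 4.57 Gyr:

```python
T0 = 4.57  # present solar age in Gyr (assumed value)

def relative_luminosity(age_gyr: float) -> float:
    """Solar luminosity at age `age_gyr`, as a fraction of today's,
    using Gough's (1981) approximation L/L0 = 1 / (1 + 0.4*(1 - t/t0))."""
    return 1.0 / (1.0 + 0.4 * (1.0 - age_gyr / T0))

# 65 million years ago the sun was roughly half a percent dimmer:
dimming = 1.0 - relative_luminosity(T0 - 0.065)
print(f"{dimming:.2%}")  # close to the 0.6% quoted above
```

The same scaling is behind the long-term point: run it forward a few hundred million years and the brightening sun alone pushes the planet toward hothouse conditions.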
Bhs1975 Posted April 9
csnavywx said: The sun was 0.6-0.7% brighter 65 million years ago. You don't need as much CO2 for a given temperature this time around. Antarctica glaciated around 650ppm, but my guess is that due to solar luminosity increases, you'd only need around 550-600 to deglaciate it this time around. Humans do not appreciate just how late in the game we showed up and how lucky we are for CO2 to be as (relatively) low as it is now. Another 200-300m years and this planet is going to be a permanent hothouse.
You mean dimmer in the past. High CO2 is what kept Earth from staying frozen before life evolved to help pull the CO2 out, and now that the Sun is getting hotter, we have to keep it below 350 ppm to keep a stable climate. That major dip in CO2 at the end is when modern humans showed up and started migrating.
TheClimateChanger Posted April 9
On 4/6/2024 at 5:48 PM, GaWx said: Charlie, 1) This chart was posted by Chris in his thread on the Midwest warming hole: note the reduced warming or even slight cooling in much of the MW during summer vs most other areas. This is despite large increases in crop sizes. 2) From Midwest pro meteorologist Mike Maguire: “On the albedo of global greening absorbing more sunshine and warming the planet. I can debunk that myth quickly. We know that the MOST warming is taking place in the coldest places and at the coldest times (higher latitudes during the Winter and at night). Those are also the times when albedo from the sun has the LEAST impact.” “The nights (with no sun) have been warming the most, days the least. Also, the driest locations with the lowest humidity are warming the most. This is because of the radiation physics of CO2 and H2O and not albedo. In areas with the highest water vapor content, water vapor crowds out much of the CO2 absorption from the same bands of absorption. Some of the radiation absorption bands are already saturated from H2O in areas with very high dew points. In drier areas........which includes ALL cold places, CO2 is able to absorb more long wave radiation because of the absence of H2O absorbing at the same bands. There's no disputing this proven law of radiation physics!”
Like the deniers always say: cherrypicked. Everybody knows the 1950s had a lot of hot summers. These statements also never account for humidity. The driest July on record in Des Moines was in 1936 (see below), which is also the hottest on record. Remarkably, its average dewpoint was even lower than in the very chilly July of 2009, which had the lowest mean dewpoint of any recent year! In fact, if you look at the highest hourly heat indices on record, you will see they were mostly set just last summer. There was nothing during the Dust Bowl remotely similar to the deadly combination of heat and humidity that occurred last summer. I don't think people realize that, without widespread air conditioning, thousands of people would die each and every summer these days. In sum, there is no evidence for milder summers in the midwest; rather, dangerous heat and humidity have been on the increase in recent decades.
Note: the 122F reading from 1936 appears to be erroneous. No other hourly heat index that day was anywhere near that measurement, so it appears to reflect a spuriously high dewpoint reading. Regardless, 8 of the 24 highest hourly heat indices on record in Des Moines were established just last summer, and almost all of the record readings are from this century.
TheClimateChanger Posted April 9
The bottom line is this: even if daytime maxima have decreased slightly in the Corn Belt, humidity has increased substantially more, so the cooler temperatures should not be read as milder. There has clearly been an increase in the incidence of dangerously high heat indices.
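[Editor's note] The way humidity amplifies apparent heat can be illustrated with the NWS heat index, computed from the Rothfusz (1990) regression on temperature (°F) and relative humidity (%). A minimal sketch, omitting the small low-humidity and low-temperature adjustment terms the official NWS algorithm applies:

```python
def heat_index(temp_f: float, rh: float) -> float:
    """NWS heat index via the Rothfusz regression (valid roughly for HI >= 80F).

    The official algorithm adds small adjustments at low RH and cool
    temperatures; those are omitted here for brevity.
    """
    t, r = temp_f, rh
    return (-42.379 + 2.04901523 * t + 10.14333127 * r
            - 0.22475541 * t * r - 0.00683783 * t * t
            - 0.05481717 * r * r + 0.00122874 * t * t * r
            + 0.00085282 * t * r * r - 0.00000199 * t * t * r * r)

# Same 95F afternoon, very different apparent heat:
print(round(heat_index(95, 70)))  # humid Corn Belt air: well above 115F
print(round(heat_index(95, 20)))  # dry air: below the actual air temperature
```

This is why a modest change in daytime maxima can coexist with a large change in dangerous heat: the regression is strongly nonlinear in humidity.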
csnavywx Posted April 9
3 hours ago, Bhs1975 said: You mean dimmer in the past. High CO2 is what kept Earth from staying frozen before life evolved to help pull the CO2 out and now that the Sun is getting hotter we have to keep it below 350ppm to keep a stable climate. That major dip in CO2 at the end is when modern humans showed up and started migrating.
Yes, thank you. Corrected it.
bdgwx Posted April 10
6 hours ago, TheClimateChanger said: The bottom line is this: Even if daytime maxima have decreased slightly in the Corn Belt, humidity has increased substantially moreso. So cooler temperatures should not be read as milder. There has clearly been an increase in the incidence of dangerously high heat indices.
Yeah. If you look at enthalpy metrics like equivalent potential temperature (theta-e), you'll see that heat (including latent heat) has actually increased in the Corn Belt.
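[Editor's note] A rough illustration of why theta-e captures this: the common approximate form θe ≈ θ · exp(Lv·r / (cp·T)) folds the latent heat of the air's moisture into an equivalent temperature. A sketch with assumed textbook constants (a simplification of the full pseudoadiabatic formula, e.g. Bolton 1980):

```python
import math

LV = 2.5e6   # latent heat of vaporization, J/kg (approximate constant)
CP = 1004.0  # specific heat of dry air at constant pressure, J/(kg K)

def theta_e(temp_k: float, pres_hpa: float, mixing_ratio: float) -> float:
    """Approximate equivalent potential temperature (K).

    Uses theta_e ~= theta * exp(Lv * r / (cp * T)), a common simplification
    of the full pseudoadiabatic formula; `mixing_ratio` is in kg/kg.
    """
    theta = temp_k * (1000.0 / pres_hpa) ** 0.2854  # potential temperature
    return theta * math.exp(LV * mixing_ratio / (CP * temp_k))

# Two 30C (303 K) surface air masses at 1000 hPa:
humid = theta_e(303.0, 1000.0, 0.018)  # 18 g/kg, a muggy Corn Belt day
dry = theta_e(303.0, 1000.0, 0.008)    # 8 g/kg, a dry day
print(round(humid - dry, 1))  # the humid air mass carries far more theta-e
```

At identical air temperature, the moister air mass carries tens of kelvins more theta-e, which is the "including latent" heat increase in question.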
GaWx Posted April 10
11 hours ago, TheClimateChanger said: The bottom line is this: Even if daytime maxima have decreased slightly in the Corn Belt, humidity has increased substantially moreso. So cooler temperatures should not be read as milder. There has clearly been an increase in the incidence of dangerously high heat indices.
TheCCer, is the bolded really true? Credit: Chris Kuball for the following Des Moines data from the heart of the Corn Belt:
“Out of those top 21 longest stretches of a 100+ Heat Index, only 5 have occurred since 2000. It does not go back to prior to 1936 but there were several other Dust Bowl years prior to 1936 with extreme heat in IA, as well as previous decades with extreme heat. Decade and # of times on the list: 30s-1 (doesn't include the very hot years 1930-35), 40s-1, 50s-2, 60s-1, 70s-4, 80s-5, 90s-2, 00s-1, 10s-3, 20s-1. In fact, last year, 2023, getting back up there in the top 21 stopped the longest streak of NOT having a year in the top 21 (11 years from 2012-2023).”
The above doesn't suggest an increase in long dangerously high heat index streaks at Des Moines over the last few decades. If anything, there was a peak in the 70s to 80s. So, with regard to the longest 100+ HI streaks at Des Moines, where's the beef?
TheClimateChanger Posted April 10
5 hours ago, GaWx said: TheCCer, Is the bolded really true? Credit: Chris Kuball for the following Des Moines data from the heart of the Corn Belt: “Out of those top 21 longest stretches of a 100+ Heat Index, only 5 have occurred since 2000. It does not go back to prior to 1936 but there were several other Dust Bowl years prior to 1936 with extreme heat in IA, as well as previous decades with extreme heat. Decade and # of times on the list: 30s-1 (doesn't include the very hot years 1930-35), 40s-1, 50s-2, 60s-1, 70s-4, 80s-5, 90s-2, 00s-1, 10s-3, 20s-1. In fact, last year, 2023, getting back up there in the top 21 stopped the longest streak of NOT having a year in the top 21 (11 years from 2012-2023).” The above doesn't suggest an increase in long dangerously high heat index streaks at Des Moines over the last few decades. If anything, there was a peak in the 70s to 80s. So, with regard to the longest 100+ HI streaks at Des Moines, where's the beef?
The record for 110F+ heat indices was shattered just last summer, with a clear upward trend. Admittedly, 105F is a bit noisier, although the record was set in 2011. There is still a long-term increase, even factoring in the spike in the 1980s.
TheClimateChanger Posted April 10
Here's a little more detail on the heat index trends at Des Moines. We can see that, despite a minimal increase in summertime maxima over the period 1936 to present, the annual hours with a heat index at or above 105F [heat advisory criteria] are increasing at a rate of about 13 hours/year per century on average. For clarification, the unit of measurement is hours/year: the rate of increase is 0.13 hours/year per year, or 13 hours/year per century. It is true the 1980s are well represented, but five of the top 10 have occurred since 2001.
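[Editor's note] The slope conversion quoted above (0.13 hours/year per year = 13 hours/year per century) is just an ordinary least-squares fit of annual exceedance hours against year, rescaled. A sketch on synthetic, illustrative data (the real Des Moines series is not reproduced here):

```python
import numpy as np

# Synthetic annual "hours at/above a heat-index threshold" series with a
# built-in trend of 0.13 hours/year per year, plus year-to-year variability.
# Illustrative only; this is NOT the actual Des Moines record.
years = np.arange(1936, 2024)
hours = 0.13 * (years - 1936) + 1.5 * np.sin(0.7 * (years - 1936))

slope_per_year = np.polyfit(years, hours, 1)[0]  # OLS trend in hours/year per year
print(round(slope_per_year * 100, 1))  # expressed per century: ~13 hours/year
```

The fitted slope recovers the built-in 0.13 hours/year per year; multiplying by 100 restates it as hours/year per century, matching the convention used in the post.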
TheClimateChanger Posted April 10
Here is the same data, but with a threshold heat index of 110F [heat warning criteria]. A heat index reading this high was very rare prior to 1980, having not exceeded 6 hours in a year until that year. Since (and inclusive of) 1980, there have been 9 years exceeding 6 hours with heat indices equal to or greater than 110F, and two additional years reaching exactly 6 hours. What happened once in the first 44 years of data has occurred 11 times in the most recent 44 years of data (or 1 out of every 4 years). Below are the years with the most hours of heat indices at or above 110F at Des Moines, Iowa. Note that last year was nearly 50% higher than any other year of record dating back to 1936.
TheClimateChanger Posted April 10
13 hours ago, GaWx said: TheCCer, Is the bolded really true? Credit: Chris Kuball for the following Des Moines data from the heart of the Corn Belt: “Out of those top 21 longest stretches of a 100+ Heat Index, only 5 have occurred since 2000. It does not go back to prior to 1936 but there were several other Dust Bowl years prior to 1936 with extreme heat in IA, as well as previous decades with extreme heat. Decade and # of times on the list: 30s-1 (doesn't include the very hot years 1930-35), 40s-1, 50s-2, 60s-1, 70s-4, 80s-5, 90s-2, 00s-1, 10s-3, 20s-1. In fact, last year, 2023, getting back up there in the top 21 stopped the longest streak of NOT having a year in the top 21 (11 years from 2012-2023).” The above doesn't suggest an increase in long dangerously high heat index streaks at Des Moines over the last few decades. If anything, there was a peak in the 70s to 80s. So, with regard to the longest 100+ HI streaks at Des Moines, where's the beef?
So, yes, to answer your question, I do believe the bolded statement to be true, and I have presented data to support it. But I should add that I was using extreme values of 105F and 110F. It is true there isn't as significant an upward trend in hours with heat indices at or above 100F, but those are more run-of-the-mill values for Des Moines in the summertime. I will grant you that consecutive days of 100F+ can be just as dangerous, but the statement you quoted is misleading in focusing solely on consecutive stretches rather than the total number of days [plus ignoring the increase at higher HX values]. It also inaccurately claims that last year ended the longest stretch with no consecutive periods of 5 days or more with 100F+ heat indices at Des Moines. This is false, and rebutted by the very data submitted: there are no such stretches indicated between 1936 and 1949, which is a longer period [13 years] than 2012 to 2023 [11 years]. So I would rate Kuball's statement as misleading, and the claim that last year ended the longest stretch with no 5-day-plus heat waves [using consecutive days of 100F+ HX] as pants-on-fire false. The very data he submitted shows the longest such stretch to have occurred from July 15, 1936 until June 30, 1949 - a period of nearly 13 years.
GaWx Posted April 10
18 hours ago, TheClimateChanger said: So, yes, to answer your question, I do believe the bolded statement to be true and have presented data to support it. But I should add that I was using extreme values of 105F and 110F. It is true there isn't as significant of an upward trend in hours with heat indices at or above 100F, but these are more run-of-the-mill values for Des Moines in the summertime. I will grant you that consecutive days of 100F+ can be just as dangerous, but the statement you quoted is misleading by focusing solely on consecutive stretches rather than total number of days [plus ignoring increase at higher HX values]. It also inaccurately claims that last year ended the longest stretch with no consecutive periods of 5 days or more with 100F+ heat indices at Des Moines. This is false and rebutted by the very data submitted. There are no such stretches indicated between 1936 and 1949, which is a longer period [13 years] than 2012 to 2023 [11 years]. So I would rate Kuball's statement as misleading, and the claim that last year ended the longest stretch with no 5-day plus heat waves [using consecutive days of 100F+ HX] as pants-on-fire false. The very data he submitted shows the longest stretch to have occurred from July 15, 1936 until June 30, 1949 - a period of nearly 13 years.
CCer, now that you point it out, I agree with you that the longest stretch was indeed 1936-49 (13 years) vs. 11 years (2012-23). Good catch; my bad for not spotting that. I hope to reply to your overall posts when I get time.