Top of the day to you, prior to this email I did send you an email
with notice of an existing Bonded Account in my Bank ( CAPITAL ONE U.K
) with the same surname as yours.. With no response from you yet... I
was wondering if you received the email? And so I had to send you this
again.
The said BONDED ACCOUNT in the Bank here where I work has a huge
deposit in excess of $8 MILLION in it. Would like to discuss this in
detail with you when you revert back to me based on the initial email
or this one cause I can assure you 100% that this should be of immense
benefit to you and myself.
So it's imperative you revert back to me with regards to this last
email because both emails contain the same information.
Please send reply to email account- (o196708fa@yandex.com)
Yours,
Andrew Higgins
ORLY? This "Higgins" clown is someone with a surname of "Poet"?
People think that electricity moves at the speed of light. Well, the influence of electric fields moves at the speed of light. The surprising truth is that actual electrons carrying current in wires move slowly in DC systems, and scarcely move at all in AC systems.
Consider a 0 gauge aluminum wire carrying 100 amperes DC. Aluminum has an atomic weight of 26.98 and a density of 2.70, with 3 valence electrons available to conduct current. This means that 1 cc of aluminum has almost exactly 0.1 moles of atoms and about 1.8056×10²³ conduction-band electrons in it.
A 0 gauge wire has a cross-sectional area of 0.535 cm², so each cm of wire has about 9.66×10²² conduction-band electrons in it. 100 amperes is 100 coulombs/second, and there are 6.2419×10¹⁸ electrons in a coulomb, so a 100-amp current means there are 6.2419×10²⁰ electrons moving across any given cross-section per second. Dividing this number by 9.66×10²² electrons per centimeter of wire yields an electron speed of just 6.46×10⁻³ cm/sec, less than 100 micrometers per second!
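If you want to play with the numbers yourself, the whole calculation fits in a few lines of Python (constants rounded as above):

```python
# Drift velocity of electrons in 0 gauge aluminum wire at 100 A DC.
N_A = 6.022e23                             # Avogadro's number, 1/mol
e_per_C = 6.2419e18                        # electrons per coulomb
mol_per_cc = 2.70 / 26.98                  # density / atomic weight
e_per_cm = 3 * mol_per_cc * N_A * 0.535    # 3 valence e- times 0.535 cm^2 cross-section
e_per_sec = 100 * e_per_C                  # 100 A in electrons per second
v_cm_s = e_per_sec / e_per_cm
print(f"{v_cm_s:.2e} cm/s = {v_cm_s * 1e4:.1f} um/s")   # ~6.5e-3 cm/s, ~65 um/s
```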
A micrometer is small for a human, but big for an atom; the electrons are moving more than fast enough to run into many atoms per second, thus generating heat. But the contrast between the speed of electric potential waves and the actual speed of the electrons is something that people have difficulty grasping.
If someone's actions show you in no uncertain terms that they are NOT in the market for vacation rentals, or new tires, or cell phones, or working for Uber Eats, or an e-book with no description whatsoever to create interest....
STOP SHOWING THEM ADS FOR THOSE THINGS.
I am NEVER going to patronize vrbo or Uber or Belle Tire because of the obnoxiousness of your ad presentation. I don't know who was plugging the e-book, and I never will. That is to the benefit of the author, because I might give them a chance someday. Those other businesses are on my permanent shit list, because of YOU.
They should sue you for damage to reputation. You are THAT bad.
There's a downvote button (sometimes), but the absence of a "NEVER show me this ad again" button is perhaps the worst error of this ad campaign.
Adding companies to boycott as they come up:
Aspen Dental (NEVER show me this ad again, God damn you)
Nutrisystem
GoodRx
4Patriots
ZuPoo
Liberty Mutual (NEVER show me this ad again either)
Equinox
AirBnB
monday.com
Update 5/26: Looks like Aspen Dental is the last company to get a clue that this level of in-your-faceness is offensive and pull their ads.
I may have good news on the feasibility front. It's got something to offend everyone, so it's probably decent on the merits.
I've been through the energy numbers for energy-assisted (carbon-lossless) conversion of biomass to fuels, and they are surprisingly achievable. I don't have the numbers for energy losses as sensible heat (I did a very detailed spreadsheet on that stuff for a patent application a few years ago and of course I can't find it now), but the error should be relatively small.
Here are the assumptions I started out with:
1 billion dry metric tons of biomass (lignocellulose) per year (somewhat below NREL's estimate of the limits).
45% carbon by mass.
17.4 GJ/ton heat of combustion.
100% conversion to CO and H2 by gasification with steam.
The input biomass has 17.4 EJ/yr heat of combustion (initial chemical energy). Full gasification of 450 million MT of carbon with water yields 1.05 gigatons of carbon monoxide and 0.075 gigatons of hydrogen, a 1:1 molecular ratio. The difference between the energy of the input biomass and the cold syngas product is 4.15 EJ/year. This comes out to about 132 GW thermal power, not including sensible heat losses; this is not much more than the electric output of the US nuclear fleet, so it appears highly likely that it could be supplied via electric power and the raw processing done as a distributed system.

This yields a CO-H2 mix which is highly toxic, but there's a potential fix for that. If half of the CO were converted to CO2 and H2 via the water-gas shift reaction, the remaining CO could be converted to methanol using the hydrogen. The mixture of CO2 and MeOH would be far less toxic and could be shipped by pipeline after local needs were satisfied. The MeOH is usable immediately; the CO2 is storable and provides a reservoir of carbon for reaction with H2 produced later. So long as all the carbon is recently extracted from the atmosphere, the system would be carbon-neutral.
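A quick Python sanity check on those figures (rounded molar masses; the GW figure is just the 4.15 EJ/yr deficit spread over a year):

```python
# Sanity check: C + H2O -> CO + H2 at 450 Mt carbon per year.
mol_C = 450e12 / 12.0              # grams to moles: 37.5 teramoles
Gt_CO = mol_C * 28.0 / 1e15        # ~1.05 Gt
Gt_H2 = mol_C * 2.0 / 1e15         # ~0.075 Gt
GW_th = 4.15e18 / 31.56e6 / 1e9    # the 4.15 EJ/yr deficit as steady power, ~132 GW
print(f"{Gt_CO:.2f} Gt CO + {Gt_H2:.3f} Gt H2; deficit = {GW_th:.1f} GW(th)")
```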
An additional 0.075 gigatons of hydrogen is required to reach the 2:1 H2:CO ratio required to make methanol or hydrocarbons, for a total of 0.150 GT of H2 overall. This yields 32.49 EJ heat of combustion, or 30.79 quads. If converted to methanol, it would yield 25.8 quads of liquid fuel; converted to hydrocarbons via F-T synthesis, it would yield less.
In 2019, the USA only consumed about 38 quads of petroleum for all purposes. 17.2 quads of that was motor gasoline, of which at least 70% can be replaced by electricity using PHEVs. (I'm getting closer to 80%, and the infrastructure isn't really in place yet.) This reduces net petroleum consumption to roughly 26 quads all by itself.
New nuclear technologies would help. If we could engineer nuclear plants which operate at perhaps 1200°C, they could supply the required process heat directly with a reactor fleet of net power just a fraction of what we're operating today. Of course, wind and solar could assist via e.g. plasma-arc gasification of biomass as a dump load, but the nuclear pathway would have a much smaller environmental footprint (and probably lower capital costs).
Such high-temperature reactors could also drive open-cycle gas turbines which need no cooling water. I shouldn't need to mention just how well a heat-driven biomass conversion process meshes as a thermal dump load in lieu of electric generation, allowing a great deal of flexibility as to load-following on the grid.
Supplying the additional 75 mmt of H2 per year is a somewhat bigger challenge. Producing this hydrogen via electrolysis of water at 50 kWh/kg H2 requires 3750 TWh of electric power, an amount roughly equal to total annual electric consumption in the USA. Using high-temperature steam electrolysis this would be reduced by about 35%, to roughly 2440 TWh (still about 60% of annual US electric generation).
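The arithmetic, for anyone checking my work:

```python
# Electric input to make 75 Mt/yr of H2 by electrolysis.
H2_kg = 75e9                          # 75 million metric tons
TWh_conventional = H2_kg * 50 / 1e9   # at 50 kWh/kg
TWh_HTSE = TWh_conventional * 0.65    # ~35% less via high-temperature steam electrolysis
print(f"{TWh_conventional:.0f} TWh vs. {TWh_HTSE:.0f} TWh")   # 3750 vs. ~2440
```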
So far I'm only describing a carbon-neutral fuel scheme, but it's possible to do better. Carbon monoxide can be steam-reformed to CO2 and hydrogen, and the CO2 can potentially be sequestered. 1.05 GT of CO plus 0.675 GT of steam react to yield 1.65 GT of CO2 plus 0.075 GT of H2, with a consequent increase in HHV of 0.6 EJ. This isn't much, but it isn't quite trivial either. Of course, a nuclear steam supply would be ideal for feeding a water-gas shift reactor.
This also solves the energy-stockpile problem. Sudden spikes in demand such as are caused by heat waves and cold snaps are a poor match for capital-intensive energy sources. Stockpiling energy as methanol, dimethyl ether or hydrogen is one way to get full utilization out of such sources while maintaining the ability to follow demand surges. Methanol in particular is handy, as it's a room temperature liquid which can be easily cracked to CO and H2 at temperatures even LWRs can reach; then the CO can be reformed to H2 with a bit of steam. This yields a carbon-free stream of fuel which can be generated from bulk-storable stockpiles upon demand and distributed by pipeline.
Next step: learning how to use matplotlib to generate Sankey diagrams of this stuff!
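In the meantime, here's a minimal sketch using matplotlib's Sankey class with the gasification numbers above (the 21.55 EJ syngas flow is just input plus process energy, since the flows have to sum to zero):

```python
import matplotlib.pyplot as plt
from matplotlib.sankey import Sankey

# Gasification energy balance, EJ/yr: inputs positive, outputs negative.
Sankey(flows=[17.4, 4.15, -21.55],
       labels=['biomass', 'process energy', 'cold syngas'],
       orientations=[0, -1, 0],      # 0 = horizontal, -1 = enters from below
       unit=' EJ', scale=1/25).finish()
plt.title("Energy-assisted biomass gasification")
plt.show()
```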
Update: I found my spreadsheet of chemical properties. I also found an EPA document on the solubility of CO2 in methanol at various temperatures. Upshot: at room temperature, a 1:1 molecular ratio of MeOH and CO2 can be held as a liquid at less than 40 bar. This is definitely a mixture which can be transported either by tankers or by pipeline, retaining 100% of the carbon either for re-use or for sequestration.
On the chemical end: at 1000°C, the enthalpies of H2O, H2 and CO are -204.09, 28.08 and -79.59 kJ/mol, respectively. At 25°C the numbers for H2 and CO are 0.01 and -110.52 kJ/mol; my coefficients for calculating the enthalpy of water aren't valid at such a low temperature, so I'm going to assume -285.82 kJ/mol (heat of formation of liquid water). Assuming 0.2 mol of excess water (steam) for each mol of CO+H2 produced, there is (28.08-0.01) + (-79.59+110.52) + 0.2*(-204.09+285.82) = 75.346 kJ/mol of sensible heat lost in the cooling of the product gas. 450 million metric tons of carbon is roughly 37.5 teramoles, for a total sensible energy loss of 2.826 EJ. Processing 450 million metric tons of biomass carbon loses sensible heat in the quenching process at a rate of 89.5 megawatts. This is a rounding error, thank goodness.
Still trying to understand the Sankey function of matplotlib.
Update 2: I slipped a decimal point and thought I was calculating joules/watts when I was actually calculating kJ/kW. The sensible heat losses will come to 89.5 GW, not MW. Some process heat reclamation will definitely be in order.
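Here's the corrected calculation with the units kept straight:

```python
# Quench-loss arithmetic: kJ/mol times teramoles gives EJ, and EJ/yr gives GW.
dH = (28.08 - 0.01) + (-79.59 + 110.52) + 0.2 * (-204.09 + 285.82)  # kJ/mol
mol_C = 450e12 / 12.0            # 450 Mt carbon = 37.5 teramoles
EJ = dH * 1e3 * mol_C / 1e18     # sensible heat lost per year
GW = EJ * 1e18 / 31.56e6 / 1e9   # spread over a year of seconds
print(f"{dH:.3f} kJ/mol -> {EJ:.3f} EJ/yr = {GW:.1f} GW")   # ~75.3, ~2.83, ~89.5
```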
Everyone in the west has heard of the Second Law of Thermodynamics, but very few can quote it accurately and even fewer actually understand it. Mathematically, it’s very simple to state:
ΔS = ΔH/T_abs
In words, the change in entropy equals the change in enthalpy (another hard-to-grasp subject, but it’s got the units of energy) over the absolute temperature. This, incidentally, is why the Carnot limit for thermal efficiency is (T_source - T_sink) / T_source; the entropy of the heat coming from the source equals the entropy of the heat going to the sink, so there is no net increase in entropy. No heat engine can achieve anything as good as the Carnot limit, and most are much worse. You can generate electricity, which has almost zero entropy, from heat, but you have to throw out lots of waste heat carrying all the entropy in the inputs, plus more entropy generated in the process, with it.
There are many, many natural processes in which local entropy decreases. Consider the freezing of water. Liquid water is a highly disordered substance, while ice is largely organized into 6-molecule rings; ice has far less entropy than the water does. So where does this entropy go? The answer is, it goes with the heat. The heat of fusion of 80 calories per gram, divided by the freezing point of 273.15 K, equals the difference of entropy between water and ice. If the temperature is lower than the freezing point, the heat which seeps into the environment from the freezing water has more entropy than the water loses in forming ice, so net entropy increases; if the temperature is higher than the freezing point, the heat coming from the environment to thaw the ice has less entropy than the ice gains in forming water, so net entropy increases. The Second Law is obeyed in both cases.
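A quick numerical check of the freezing example (80 cal/g heat of fusion, 4.184 J/cal):

```python
# Entropy bookkeeping for 1 gram of water at the ice point.
Q = 80 * 4.184                    # heat of fusion, J/g
T_fp = 273.15                     # freezing point, K
# Freezing on a cold day: the water sheds Q/T_fp, the environment absorbs Q/T_env.
T_env = 263.15
print(f"freezing: net dS = {Q / T_env - Q / T_fp:+.4f} J/K")   # positive
# Thawing on a warm day: the environment sheds Q/T_env, the ice absorbs Q/T_fp.
T_env = 283.15
print(f"thawing:  net dS = {Q / T_fp - Q / T_env:+.4f} J/K")   # also positive
```

Net entropy increases in both directions, exactly as the Second Law demands.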
What happens to this entropy? It doesn’t accumulate on earth; it gets radiated out into space along with the escaping heat. The entropy of the universe in general increases, but the entropy of earth is DECREASING as e.g. heat from the core and mantle seeps up to the surface and is lost.
This applies to more complex processes as well. Consider the bête noire of all “2LOT makes evolution impossible” zealots: the green plant. A plant takes high-entropy CO2 and water and turns them into low-entropy sugars and amino acids and phenols and lipids, from which it builds even lower-entropy cellulose, lignin and cells. Since we don’t have any processes which can decrease entropy in general, where is it going? It’s expelled from the plant as low-entropy sunlight (effective temperature of about 5700 K) is converted into high-entropy ambient heat. Every step from photon to excited chlorophyll molecule to ATP to splitting water to reducing CO2, making sugars, etc. is irreversible; each generates entropy. But the plant and its environment become MORE orderly as gas and water are assembled into organized molecules and larger units; the entropy is dissipated as heat and ultimately radiated off to space.
The Miller-Urey experiment, repeated many times over, is more proof that the 2LOT is no obstacle to local decreases in entropy. Converting water and light gases into a soup of sugars, amino acids and nucleotides is most definitely a decrease in entropy, but it’s far outweighed by the entropy generated as the electricity for the spark or UV light is converted to ambient heat. The key take-home fact is that entropy doesn’t stay inside the reaction vessel. The contents can become more orderly over time.
I’ve never encountered a single anti-evolutionist who can address these facts. It’s a mind-killer for them; it threatens their entire concept of self. They’re as bad as social justice warriors in that regard, and that’s saying something. When irrefutable facts are rejected as a moral issue, that’s a problem.
It was obvious from the outset that the premises behind the Energiewende were false (and possibly fraudulent). While building out huge amounts of wind and PV generation, Germany has thus far propped up the house of cards with massive subsidies. Now those subsidies are expiring, and wonder of wonders, look what is happening:
Around 5,000 wind turbines with a total output of 3.7 gigawatts (GW) will fall out of the 20-year EEG subsidy regime at the end of the year. Now the operators have another problem: Due to the corona pandemic, prices on the electricity market have dropped drastically, which means that the vast majority of older wind farms lack the prospect of economic viability. They face shutdown and the German electricity mix is threatened with the loss of considerable amounts of green electricity.
The amounts of electricity from the old wind turbines could be sold on the power exchange or directly to energy suppliers. However, due to the corona crisis and the falling wholesale price for natural gas, there has been a significant drop in prices on the electricity market in recent months. Last July a megawatt-hour of electricity cost over 53 euros; on March 23 it was just under 34 euros. The costs of continued operation can hardly be recovered even at the cheapest locations, so massive shutdowns loom.
Germany was betting that those wind farms would continue to operate and continue to generate carbon-free electricity, likely long into the future. But the skeptics were right: as soon as the subsidies stop, so do the pinwheels. The carbon impact of this is unclear, but if 3.7 GW of wind at 21.7% capacity factor is retired and replaced with gas-fired turbines emitting 550 gCO2/kWh, that's an additional 3.87 million tons of CO2 emitted per year. That is a significant hit, increasing German CO2(e) emissions by more than half a percent over 2018's 725.7 million tons CO2(e). When successes are measured in fractions of a percent per year, this is yet another major blow to climate goals and shows just how wrong-headed the ideas behind the Energiewende were all along.
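The arithmetic behind that estimate:

```python
# CO2 penalty of retiring 3.7 GW of wind and backfilling with gas turbines.
GWh = 3.7 * 0.217 * 8766           # lost annual wind generation, GWh
Mt_CO2 = GWh * 1e6 * 550 / 1e12    # kWh times gCO2/kWh, in megatonnes
print(f"{GWh:,.0f} GWh/yr -> {Mt_CO2:.2f} Mt CO2/yr "
      f"({Mt_CO2 / 725.7:.2%} of Germany's 2018 emissions)")
```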
If you dutifully read your U.S. mainstream media, you undoubtedly have the impression that “clean” and “green” energy is rapidly sweeping all before it, and soon will supplant fossil fuels in powering our economy. After all, many major states, including California and New York, have mandated some form of “net zero” carbon emissions by 2050, or in some cases even earlier. That’s only 30 years away. And reports are everywhere that investment in “renewables,” particularly wind and solar energy, continues to soar. For example, from Reuters in January we have “U.S. clean energy investment hits new record despite Trump administration views.” In the New York Times on May 13 it’s “In a First, Renewable Energy Is Poised to Eclipse Coal in U.S.” The final victory of wind and solar over the evil fossil fuels must then be right around the corner.
Actually, that’s all a myth. The inherent high cost and unreliability of wind and solar energy mean that they are highly unlikely ever to be more than niche players in the overall energy picture. Politicians claim progressive virtue by commissioning vast farms of wind turbines and solar panels, at taxpayer or ratepayer expense, without anyone ever figuring out — or even addressing — how these things can run a fully functioning electrical grid without complete fossil fuel backup. And the electrical grid is the easy part. How about airplanes? How about steel mills? I’m looking for someone to demonstrate that this “net zero” thing is something more than a ridiculous fantasy, but I can’t find it.
As I like to say, "I'm from Missouri. Show me."
The fact is that Germany has nowhere further to go by building more wind and solar facilities. When the wind blows on a sunny day, they already have more power than they can use, and they are forced to give it away to Poland (or even pay the Poles to take it). On a calm night, no matter how much wind and solar they build, it all produces nothing. Without the coal plant, the lights go out. Talk about climate virtue all they want, but no one has yet even begun to work on a solution to get past this hurdle.
Which brings me to the most important piece in the GWPF email, from Cambridge Professor Michael Kelly, appearing in something called CapX on June 8, with the headline “Until we get a proper roadmap, Net Zero is a goal without a plan.”
Been saying this too.
Read enough of this stuff and you gradually realize that almost everything you read about supposed solutions to climate change is completely delusional.
RTWT, and follow any link that looks good. You'll be glad you did.
Now I come to the fraud that is Renewable Energy Certificates. Plainly put, they are a way to claim virtue for being "renewable" while still relying on a dirty grid. Real Clear Energy just called out New England on it:
New Englanders like the idea of wind energy; they just don’t want any wind turbines in New England. So they are putting them in New York.
For proof of that, consider the 126-megawatt Cassadaga Wind Project, now being built in Chautauqua County, New York’s westernmost county.
...
In an email, a spokesperson for Innogy confirmed that the buyer of the power to be produced by Cassadaga “is a group of seven New England utilities procured through the New England Clean Energy request for proposals” in 2016. How will the juice from New York get to New England? It won’t. Instead, the Innogy spokesperson told me that the energy produced by the turbines at Cassadaga “will be used to serve local energy requirements in areas surrounding the project. Export to areas outside New York would require dedicated point-to-point transmission lines.”
Nevertheless, thanks to the magic of renewable-energy credits, New England utilities will get to claim the wind energy that’s being produced in Chautauqua County as their own. The Innogy spokesperson said the utilities “can purchase the energy generated from Cassadaga Wind without having a direct point-to-point transmission connection.”
When completed, the Cassadaga project will increase the amount of renewable energy that is being generated in New York but that will be credited to New England.
It's nice to see bigger voices than mine calling out the fakery by name. More, please.
One of the glaring flaws (far more than a mere foible) of "renewables" (wind and PV) is that they are unreliable. SO unreliable, as a matter of fact, that they force the adoption of much dirtier fossil-fired generators to accommodate their output swings.
Naive greenies think that "RE" can just be thrown onto the grid, but in fact an RE-heavy grid requires different generating technologies than one with little or none. You can generally follow the normal load curve using a CCGT plant, which can be up to 64% efficient (LHV). Following the bumpiness of "renewables" mostly requires simple-cycle gas turbines (the CCGT steam systems don't like rapid power variations); the best open-cycle I've read about gets only 46% efficiency, and I recall that the single-shaft industrial models often get something like 38%. IOW, you're burning a lot more fuel for the same electric output. This puts you way behind emissions-wise.
Let's use a real-world example: the Mitsubishi-Hitachi M501JAC gas turbine, which is available in both simple-cycle and combined-cycle versions. This allows a head-to-head comparison. The single-unit combined-cycle version of the M501JAC is rated at 614 MW and 64.0% LHV efficiency. It doesn't even HAVE a specified turndown ratio, minimum rated output, rated ramp rate or startup time. One can conclude from this that it really isn't suitable for trying to follow the ups and downs of "renewables", though it can probably handle normal load curves because other steam-turbine plants have been doing it for the last century.
The simple-cycle heat rate of this unit is 7775 kJ/kWh (LHV). Since a kilowatt-hour is 3600 kJ, we just divide 3600 by 7775 to get 0.463, or 46.3%. The rated output is 425 MW and the rated ramp rate is 42 MW/minute, or about 10% per minute; it can be turned down to 50% of rated output, so it can go from minimum to full output in 5 minutes. This can track things like surges and sags from passing clouds and weather fronts pretty well. Its startup time is specified as 30 minutes.
What you pay for this flexibility is efficiency. Going from 64.0% down to 46.3% means burning 38% more fuel. Put another way, you need to get 27.6% of your juice from emissions-free sources just to break even on the increased emissions from going from combined-cycle to simple-cycle... and that assumes that you maintain the 46.3% efficiency at lower output power, which you won't. GE makes this data very hard to find, but the efficiency of the LMS100 gas turbine drops from 44.3% at rated power down to under 40% at half rated power (the minimum). This means even MORE fuel required.
Typical capacity factors for wind are 30-40%; PV is much lower. If you're getting 30% of your juice from "renewables", and you're burning at least 38% more fuel per kWh to get the rest, you're saving less than 3.3% from the CCGT emissions figure. At low enough capacity factors, you can actually burn more fuel with the addition of "renewables" than what you could do with all-fossil.
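Putting the break-even arithmetic in one place, using the M501JAC figures from above:

```python
# Emissions break-even for balancing "renewables" with simple-cycle turbines,
# using the M501JAC efficiencies above (64.0% CC vs. 46.3% SC, LHV).
eff_cc, eff_sc = 0.640, 0.463
penalty = eff_cc / eff_sc              # relative fuel burn per fossil kWh: ~1.38
breakeven = 1 - 1 / penalty            # RE share needed just to match all-CCGT
saving_30 = 1 - 0.70 * penalty         # net saving at a 30% RE share
print(f"fuel penalty: +{penalty - 1:.1%}, break-even RE share: {breakeven:.1%}, "
      f"saving at 30% RE: {saving_30:.1%}")   # +38.2%, 27.7%, ~3.2%
```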
Is it worth spending so much money for such paltry gains? Even if your wallet can stand it, can the planet?
Now, don't let it be said that there aren't ways around this. With enough excess RE capacity you can just brute-force the issue by dumping excess power to resistance heaters in a CCGT's gas turbines, substituting electricity for fossil fuel and managing the rapid power swings on the demand side. But this is going to hit the economics, and nobody even seems to be thinking that far out of the box.
Hypedrogen has been the holy grail of the renewablistas since the 1970's, when it was also one of the magic bullets that was going to solve air pollution from cars. Here we are 50 years later and we're still burning gasoline (and the oilcos have laughed all the way to the bank). The talk seems to be getting more serious recently. But is it realistic?
But the really interesting part of that article comes down at the very end:
MHPS and Magnum Development have partnered on the idea of building an electrolysis facility near the Intermountain Power Plant around Delta, Utah. The electrolysis–which uses electricity to separate water into its hydrogen and oxygen molecules–would be powered by renewable energy, such as western U.S. solar, wind and hydro. The resulting hydrogen would be stored in underground salt caverns deep beneath the Utah rocky soils.
Ducker estimated that each of those salt caverns potentially could store 150,000 to 200,000 MWh of hydrogen capacity. The area could offer dozens of those caverns, all impermeable and yielding no energy loss.
“Think of it as a really really big battery,” he said.
A really really big, if rather lossy, battery. I note that storage in salt domes avoids any issues of sulfate minerals which hydrogen could react with and be lost as hydrogen sulfide and water. You'd likely have that trouble if you tried to use old gas wells to store hydrogen. Methane is an extraordinarily stable molecule; hydrogen is not.
Let's assume, out of charity, that those numbers are the energy you could get out of the hydrogen power plant rather than the 56% larger requirement for stored energy, or the even greater figure for energy input to make it in the first place. 150,000 to 200,000 MWh of energy sure sounds like a lot if you're not familiar with the field, but it's roughly 1 week of generation from a 1000 MW power plant... of which the USA has the equivalent of about 460 running flat out on average. A reserve of 90 days of energy (what the Trump administration wants at least some plants to hold in case of fuel supply disruptions) is roughly 13 weeks, or about 1 dozen such reservoirs to supply just one 1000 MW plant. "Dozens" of reservoirs translates to just a handful of plants being able to ride through a sustained period of energy famine... such as the most populated parts of the USA endure every winter when the sun heads south. To supply the electric grid reliably you'd need close to 10,000 of them. And that wouldn't supply the requirements for heating fuel, for vehicle fuel, for industrial heat and chemical feedstock.
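The cavern arithmetic, using the midpoint of the quoted range and un-doing the charitable no-loss assumption with the 56% figure from above:

```python
# "Dozens" of 150,000-200,000 MWh caverns vs. 90 days of US energy security.
cavern_MWh = 175_000                       # midpoint of the quoted range
per_plant = 1000 * 24 * 90 / cavern_MWh    # caverns per 1000 MW plant: ~12
fleet = 460                                # average US generation, in 1000-MW plants
face_value = per_plant * fleet
with_losses = face_value * 1.56            # undo the no-loss charity from above
print(f"{per_plant:.0f} per plant; {face_value:,.0f} at face value, "
      f"~{with_losses:,.0f} with the ~56% storage overhead")   # close to 10,000
```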
You'd need probably 20,000 such reservoirs just to have 90 days of energy security. The area has "dozens". The inadequacy of the resource to the task is obvious once you know where to look. And that's just one of the problems you'll face if you try to power any industrial economy on "renewables".
It's time to face facts. The "environmentalists" are demonstrably not doing the arithmetic (and it's arithmetic, not even algebra) to properly understand the magnitude of the gap between their proposals and reality. It's likely that they have been forbidden to do the arithmetic by the people who finance their organizations. Who would benefit from such betrayal? Fossil fuel interests.
Save for a few dissident organizations like Environmental Progress and the Breakthrough Institute, the environmental movement has become a front for fossil fuel interests. They've been corrupted by donor money. Do not trust them. Do not listen to them. And when their activists come to plug "renewables", call them the liars they are.
Once commonly considered a “bridge fuel,” electric utilities now must face the mathematical reality that fast-falling clean energy costs mean the bridge only leads to climate breakdown and the destruction of shareholder value.
Natural gas is being used as the backup fuel for balancing the unreliable supplies from wind and solar. It is delivered on a just-in-time basis and cannot be stockpiled, so it is inferior to coal as a buffer against supply disruptions or demand surges. However, it is currently cheaper than coal so those deficiencies are being overlooked until crises like the "polar vortex" strike.
A new report from Energy Innovation and shareholder advocacy group As You Sow outlines these evolving risks for shareholders.
In other words, O'Boyle's group.
Utility investment in new natural gas infrastructure makes less and less sense from multiple angles and only compounds risks for investors, consumers, and society. New natural gas infrastructure is incompatible with a low-emissions future and faces intense economic competition from wind[1], solar[2], storage[3], and clean energy technologies[4].
Solar's output is strongly counter-cyclical to demand at northern latitudes, being the least available when it is needed the most for heating.
"Storage" meaning batteries or PHS, presumably. PHS systems are typically sized for less than 24 hours at full power and batteries generally a handful of hours at most.
What OTHER "clean energy technologies" are there? Are any of them deployed at scale? Can they be expanded? This is deliberate deception.
On with the show.
Greater scrutiny of fossil fuel infrastructure at the regulatory commission level also looms large. Financial and climate concerns have recently led several local commissions to reject utility plans for new gas power plants, including in Indiana, Arizona, and California. These actions point toward a future where demand for gas is limited.
Unless there is another source of energy to fill in for the frequent absences of wind and solar, gas will still be required. The requirements will not be as much overall, but peak demand will remain and perhaps even increase as efficient combined-cycle plants are shut down and replaced by open-cycle peakers.
Today, new unsubsidized wind costs $28-54/megawatt-hour (MWh), and solar costs $32-44/MWh, while new combined cycle natural gas costs $44-68/MWh. In short, in almost all jurisdictions, utility-scale wind and solar are now the cheapest source of new electricity without subsidies.
Using LCOE is deliberate deception. LCOE ignores the costs of firming and backup, which O'Boyle wants everyone to ignore. The only even-somewhat valid figure of merit for unreliable generators is Levelized Avoided Cost of Energy (LACE).
For example, NV Energy’s recent procurement of 1,200 megawatts (MW) solar and 580 MW of four-hour battery storage already beats new natural gas on price. NV Energy paid $20/MWh for solar and $13/MWh for enough battery storage to shift 25% of daily energy, resulting in a total cost of $33/MWh per MWh delivered (including federal tax credits).
We're supposed to ignore "including federal tax credits" because we can all make out better by robbing someone else's taxes paid to fund our own energy consumption. Not.
While often cited as the clean energy transition’s largest barrier, it is increasingly clear new natural gas won’t be needed to ensure grid reliability.
And who's saying this?
Studies by the National Renewable Energy Laboratory, National Oceanic and Atmospheric Administration, Evolved Energy, and Vibrant Clean Energy have found that 80% or more of our electricity could be produced from renewable sources without reliability or affordability issues.
Additional gas capacity, baseload generation 'critical' to maintaining reliability: DOE analysis
Dive Brief:
A new analysis from the U.S. Department of Energy's National Energy Technology Laboratory (NETL) concludes additional natural gas pipeline capacity and baseload generation units, such as coal and nuclear, are "critical" to maintaining grid reliability and affordable electricity in the Eastern Interconnection during extreme weather events.
Coal power advocates argue that the continued retirement of coal-fired generating units threatens grid reliability and could lead to double-digit spikes in electricity prices in several wholesale markets, but clean energy advocates counter that renewables are now the cheapest energy option and can keep the grid operating reliably.
According to the NETL report, a "conservative" analysis shows investment in new pipeline capacity of more than $1 billion is needed to maintain reliability, though dual-fueled plants can partially relieve peak demand.
Dive Insight:
As more wind and solar energy comes online, the new DOE study questions whether those intermittent resources can maintain reliability in extreme weather.
“As the power sector relies more and more on natural gas and renewable sources for power generation, infrastructure must keep pace with this growth,” NETL Director Brian Anderson said in a statement.
....
NETL's report examines the near-term economic and reliability costs associated with expanding the natural gas generation network. The analysis concludes dual-fueled plants can partially relieve peak demand for natural gas, "but it will be difficult to maintain adequate fuel availability to meet that demand when more coal and nuclear resources are lost."
According to the DOE research, there is a need for additional pipeline capacity as thermal generating units are retired.
"Natural gas deliverability constraints lead to high fuel and electricity price spikes," the report finds. It concludes those spikes are "exacerbated by the continued retirements of thermal units," which are expected to top 44 GW through 2024.
"Conservatively, an investment of $470 million to $1.1 billion over that already entrained in the long-haul natural gas transmission system is identified to avoid even worse outcomes," the report estimates.
Total hits for "nucl": 3.
It's obvious who's being honest, and who's lying through their teeth.
If we can grab CO2 out of the atmosphere for 2 GJ/ton, 1 TW(e) would capture 15.8 gigatons/year; at 2.5 GJ/ton you'd get almost 13 GT/yr. The world only emits about 35 GT/yr; 3 TW(e) would likely get it all, and then some. Of course the best solution is to use carbon-free (e.g. nuclear) energy to avoid generating CO2 in the first place, but if we need to reduce CO2 levels rapidly we now have something in the toolbox.
3 TW(e) or even 1 TW(e) is a lot, but since the atmosphere is global you can do CO2 capture anywhere, any time; you can e.g. overbuild nuclear and use surplus generation to scour CO2, or put floating wind farms in the wind belts like the "roaring forties" and have them grab CO2 and put it on the sea floor in bags. Excess atmospheric CO2 is rapidly becoming a problem with a real engineering solution.
I wondered what the characteristics of such a capture system might be. If this polyanthraquinone worked down to a concentration of 300 ppm given sufficient driving voltage, and consumed 110 kJ/mol (2.5 GJ/ton) in the process, what would it look like?
Assuming a Roaring Forties wind speed of no less than 8 m/s, blowing through an ocean-borne capture system with internal air speed of 5 m/s, each square meter of frontal area processes 5 m3 of air per second. At 400 ppmv CO2 concentration at the inlet and 300 ppmv at the outlet, CO2 would be removed at the rate of 0.5 l/sec. Given CO2 gas density of 1.907 g/liter at 10°C, this comes to 0.954 gCO2/m2/sec; at 2.5 GJ/ton (2.5 kJ/g) that's about 2.38 kW/m2 of collector area. Assuming the full power input is dissipated as heat, the air temperature rise through the collector would be less than 0.5°C.
12 m/s wind speed is about where most commercial wind turbines reach their rated output. The output curves, where given, do not follow a straight cubic function as one would expect from the raw physics; the rise is more rapid at low speeds and reaches maximum at an asymptote, not a corner. This makes it difficult to estimate the output at lower speeds. However, extrapolating from the cubic curve at 8 m/s, a minimum of 29.6% of rated output can be assumed. Full rated output is generated at 12 m/s which is at the maximum of typical wind speeds. Splitting the difference, an average generation of 65% seems reasonable.
Soaking up the full rated output of a 6 MW wind turbine at 2.38 kW/m2 of collector area would require 2521 m2 of collector. This is large, but hardly impossible; it's a square slightly more than 50 m on a side, compared to a machine with a rotor diameter over 200 m. Higher wind speeds might increase the flow through the collector, thus requiring less area. At full power, a 6 MW(e) wind turbine could power a collector extracting 2.4 kg/sec of CO2 from the atmosphere. That's 8640 kg/hr, 207 tons/day, 75.7 thousand tons/year; this gets to serious quantities very quickly.
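Pulling the collector numbers together in one script (all inputs as assumed above; the text's 2521 m² came from the rounded 2.38 kW/m²):

```python
# The wind-powered CO2 scrubber, end to end.
v = 5.0                       # m/s air speed through the collector
d_ppm = 100                   # 400 ppmv in, 300 ppmv out
rho = 1.907                   # g CO2 per liter at 10 C
e = 2.5                       # kJ/g capture energy (2.5 GJ/ton)
flux = v * 1000 * d_ppm / 1e6 * rho   # g CO2 per m^2 of frontal area per second
kW_m2 = flux * e                      # electric power per m^2
area = 6000 / kW_m2                   # m^2 to soak up a 6 MW turbine
kg_s = 6000 / e / 1000                # capture rate at full power, kg/s
print(f"{flux:.3f} g/m2/s, {kW_m2:.2f} kW/m2, {area:.0f} m2, "
      f"{kg_s:.1f} kg/s = {kg_s * 31.56e6 / 1e6:.1f} kt CO2/yr at full power")
```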
So yes, it does appear likely that we can remediate the earth's atmosphere to any CO2 concentration that we deem desirable and appropriate. If we don't have the technology yet, we are well on our way to having it in time; we don't have the energy yet, but we have every reason to get it for other reasons. At this point, all we really need is the will to get the job done.
Over at Atomic Insights, Rod Adams points us to a newly-issued US patent for an alleged "fusion device". There are heaps of problems with it (gigawatt outputs? coming mostly as hot neutrons? holy neutron activation and radiation poisoning, Batman!) but a few examples will suffice.
These jerks made the PDF as difficult as possible to process, starting with saving it as a series of page images rather than text which can be searched and copied. That's just one of the ways it emits the aroma of snake oil. Page 5:
Each conical structure 200, opposing each other in pairs, may have smoothly curved apex sections 201, and/or include assemblies of electrified grids 202 and toroidal magnetic coils 203.
Toroid coils confine their magnetic fields inside the minor radius. They have next to no magnetic field outside the minor radius. I could see a solenoid coil but toroids would simply be useless for influencing a plasma outside the toroid coil itself, and that includes the space between these so-called "fusors". This looks like fusion word salad.
In order to heat the plasma core 75 at the extreme temperatures that fusion requires, the electrically charged dynamic fusors 200, 230 generate high electromagnetic radiation by virtue of their accelerating spin.
Word salad. The mass of plasma is negligible compared to the mass of tungsten-based electrodes. The one thing I could see as a possibility is the use of mechanical twisting of a magnetic field around a diamagnetic plasma to induce currents and consequent heating, but that would require solenoid coils rather than toroidal coils.
In order to hold an electric charge of at least one Coulomb
One coulomb is an enormous amount of electric charge. Supercapacitors store multiple coulombs by way of equally enormous amounts of surface area of their virtual "plates", which are made of things like activated carbon. In a small device with discrete plates and capacitance measured in picofarads, storing a coulomb would require voltages in the billions of volts. That's in excess of the breakdown voltage of any available material and would immediately arc over. There are equally enormous energies involved. One coulomb in a gigavolt capacitor stores 5e8 joules, about 139 kWh. Forget fusion, if you can handle that you've got a killer battery. IOW, ain't gonna happen.
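The arithmetic (E = QV/2 for a capacitor charged to voltage V):

```python
# Energy stored by one coulomb at a gigavolt.
Q, V = 1.0, 1e9
E = 0.5 * Q * V
print(f"{E:.1e} J = {E / 3.6e6:.0f} kWh")   # 5e8 J, ~139 kWh
```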
My impression is that this is going to be revealed shortly as Sokal Hoax III, an epic troll of both the Green energy believers and the USPTO. I wouldn't be the least bit surprised to learn that this "inventor" doesn't even exist.
Edit: Thunderf00t is a good storyteller but weak on the nuclear stuff. Here are my notes, addressed as an open letter to him:
First, you missed a completely obvious way to debunk the "5 megaton" garbage. It only took 10 megatons to completely erase the island of Elugelab in the Ivy Mike test. 5 MT would have scoured Pripyat off the ground and turned the entire Chernobyl power plant to vapor. Instead, most of the reactor building was still standing! That wasn't a megaton or even kiloton-level explosion; it was worth, at most, a few hundred pounds of TNT.
Second, you've got a whole lot of your concepts about nuclear fission pretty badly wrong.
The reason that low-enriched uranium can't make a bomb is because you literally cannot sustain a chain reaction in pure LEU, or even LEU oxide, no matter how much of it you have. The detail of "cross sections" comes to bite you; a fission neutron straight from a nucleus is about as likely to be absorbed by a U-238 nucleus that it goes near (and make no further neutrons) as it is to be absorbed in passing by a U-235 nucleus. With U-238 being vastly more abundant, fission neutrons can't replace themselves and the "reaction" has no "chain"; the chain gets broken almost immediately.
So, how did the Chicago crew create a chain reaction in natural uranium (just 0.711% U-235)? They had a MODERATOR, in the form of a big pile of relatively pure graphite bricks. The graphite, almost pure carbon, only rarely tends to absorb neutrons but does a fairly good job of slowing them down as the neutrons bounce around. And as the neutrons slow down, a funny thing happens: U-235 atoms are HUGELY more successful in catching slow ("thermal") neutrons than U-238 atoms are. When you get things slowed down JUST enough that each fissioning atom leaves neutrons that wind up splitting exactly one more atom, the chain goes unbroken: you have a self-sustaining "chain reaction". But for this to work, the moderator has to be between the fuel elements and slow neutrons down before they can get sucked up by U-238 or escape entirely.
What does this have to do with a reactor meltdown? As soon as the fuel melts and runs together, it loses the moderation because the moderator is now outside the fuel mass, not between bits of it. Ergo, the chain is broken and the reaction stops. (In reactors using water as a moderator, losing the water also shuts down the chain reaction. Chernobyl used graphite.)
But that doesn't stop the heat. The OTHER thing you neglected is that the fission reaction itself is not the only source of heat in a reactor! About 6.5% of the energy actually comes from the radioactive decay of the fission products, the daughter nuclei created by the splitting atoms. This heat does not stop when the chain reaction stops; you have to wait for the material to "cool" as the "hottest" fission products decay away. The stuff that decays the fastest releases heat the fastest, and goes away fastest. Within an hour the "afterheat" is down to 1.5%, 0.4% after a day and 0.2% after a week.
Maybe you want to re-record some of your narration on your video to get those details right. Just sayin'.
PS: No I was not drunk when I wrote this, just fat-fingered. All typos spotted have been corrected.
This paper should be shaking the world. It should have turned our radiation-exposure standards upside-down. It should have established that regular low-dose radiation exposure is our best prophylactic against both cancer and birth defects. Yet nothing of the sort has happened.
Well, what happened? FTP:
Abstract — The conventional approach for radiation protection is based on the ICRP’s linear, no threshold (LNT) model of radiation carcinogenesis, which implies that ionizing radiation is always harmful, no matter how small the dose. But a different approach can be derived from the observed health effects of the serendipitous contamination of 1700 apartments in Taiwan with cobalt-60 (T½ = 5.3 y). This experience indicates that chronic exposure of the whole body to low-dose-rate radiation, even accumulated to a high annual dose, may be beneficial to human health.
Approximately 10,000 people occupied these buildings and received an average radiation dose of 0.4 Sv, unknowingly, during a 9-20 year period. They did not suffer a higher incidence of cancer mortality, as the LNT theory would predict. On the contrary, the incidence of cancer deaths in this population was greatly reduced – to about 3 per cent of the incidence of spontaneous cancer death in the general Taiwan public. In addition, the incidence of congenital malformations was also reduced – to about 7 per cent of the incidence in the general public.
The paper contains this graph of cancer mortality:
Nothing of the sort has
happened. Nothing.
Why?
Are the people in charge of our "health" evil, or just stupid?
Edit: Backup paper link https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2477708/
I find the contrary paper suspicious. It's held behind a paywall, so the basis for the conclusions will not be examined by very many people. This is exactly what a scientific fraud would do.
Three Mile Island. The name still elicits fear, forty years later. Yet the whole accident had zero casualties; there were no deaths and no injuries.
The list of energy-related accidents with greater tolls is long. Natural gas pipeline explosions have killed quite a few in the USA alone. Oil wiped out the center of Lac-Mégantic in 2013, killing 47. And collisions between road vehicles and coal trains regularly kill and injure, mostly in ones and twos.
PHOENIX (AP) — Arizona's largest electric company installed massive batteries near neighborhoods with a large number of solar panels, hoping to capture some of the energy from the afternoon sun to use after dark.
Arizona Public Service has been an early adopter of battery storage technology seen as critical for the wider deployment of renewable energy and for a more resilient power grid.
But an April fire and explosion at a massive battery west of Phoenix that sent eight firefighters and a police officer to the hospital highlighted the challenges and risks that can arise as utilities prepare for the exponential growth of the technology.
Despite the very small number of units in service, this is not the first battery fire. It won't be the last, either; current plans involve many more and much bigger installations. Running up a list of casualties while being such a minor component of the electric system ought to have people asking questions, like...
"Are these things safe to have in my neighborhood?"
"Are these things safe to have anywhere?"
Anyone who dares to ask those questions, though, is bound to come under vicious attack from the proponents of "renewables". Meanwhile, those same proponents spread fear of nuclear power, despite nukes being objectively much safer than even smallish utility-scale batteries.
Evil, or just crazy? It's got to be one or the other.
The re-appearance of the ORNL direct electrocatalytic conversion of CO2 to ethanol and other products in news about ReactWell piqued my curiosity about this process. The paper describes the yields in terms of "Faradaic efficiencies", an unfamiliar term, but it appears to relate to Coulombic efficiencies by a direct conversion factor; one Faraday is a mole (6.022×10²³) of electrons, while a Coulomb is an ampere-second. Ergo, one Faraday is roughly 96485 Coulombs.
I'm interested in the mass yields and energy efficiency of the process, which requires converting from faradaic to regular physical units. First comes the required charge transfer per reaction. I calculate the stoichiometry as follows:
2 H3O+ + 2 e- ➡ H2 + 2 H2O
CO2 + H2O + 2 e- ➡ CO + 2 OH-
CO2 + 6 H2O + 8 e- ➡ CH4 + 8 OH-
2 CO2 + 9 H2O + 12 e- ➡ CH3CH2OH + 12 OH-
Given the Faradaic conversion efficiencies to various products as given in the paper, I come up with these net yields:
Product | F. yield, % | e-/reaction | mol/Faraday | g yield/mol CO2
EtOH    |    63.0     |     12      |   0.0525    |      17.31
CH4     |     6.8     |      8      |   0.0085    |       0.97
CO      |     5.2     |      2      |   0.0260    |       5.22
H2?     |    25.0     |      2      |   0.1250    |       n/a
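The table practically writes itself once the stoichiometry is set; here's the conversion scripted in Python (standard molar masses; small differences from the table are rounding):

```python
# From Faradaic yields to molar and mass yields.
yields = {                    # product: (Faradaic yield, e- per molecule, g/mol,
    "EtOH": (0.630, 12, 46.07, 2),  #       CO2 molecules per product molecule)
    "CH4":  (0.068,  8, 16.04, 1),
    "CO":   (0.052,  2, 28.01, 1),
    "H2":   (0.250,  2,  2.016, 0),
}
mol_CO2 = sum(fy / ne * nc for fy, ne, g, nc in yields.values())  # mol CO2/Faraday
for p, (fy, ne, g, nc) in yields.items():
    mol = fy / ne             # mol of product per Faraday of charge
    per_CO2 = mol * g / mol_CO2 if nc else float("nan")
    print(f"{p}: {mol:.4f} mol/F, {per_CO2:.2f} g per mol CO2")
```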
Re-crunching this with an eye toward heat of combustion of the products:
Calculating total input energy naïvely, 96485 coulombs times 1.2 volts yields 115.8 kJ. This is clearly nonsense. Going back to the electrochemistry, the paper declares that the potential is given "vs. RHE", a reversible hydrogen electrode. The oxygen evolution reaction is going to occur at a considerably higher potential than this. The equilibrium potential of an oxygen electrode is +1.23 V vs. RHE, which sets a floor of 2.43 V on the cell voltage. Using that, 96485 coulombs times 2.43 volts yields 234.4 kJ for a maximum electricity-to-fuel efficiency of 52.3%; only 37% goes toward reducing CO2 and just 30.6% to energy in ethanol. 15.3% goes to hydrogen.
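The same calculation, scripted (standard HHVs in kJ/mol; small differences from the figures above are rounding):

```python
# Electricity-to-fuel efficiency at a 2.43 V cell (1.2 V vs. RHE + 1.23 V O2 anode).
E_in = 96485 * 2.43 / 1000            # kJ per Faraday: ~234.5
HHV  = {"EtOH": 1366.8, "CH4": 890.4, "CO": 283.0, "H2": 285.8}     # kJ/mol
mols = {"EtOH": 0.0525, "CH4": 0.0085, "CO": 0.0260, "H2": 0.1250}  # per Faraday
out = {p: mols[p] * HHV[p] for p in mols}
to_CO2 = out["EtOH"] + out["CH4"] + out["CO"]   # energy landing in carbon products
print(f"total {sum(out.values()) / E_in:.1%}, CO2 reduction {to_CO2 / E_in:.1%}, "
      f"EtOH {out['EtOH'] / E_in:.1%}, H2 {out['H2'] / E_in:.1%}")
```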
ReactWell appears to be a bio-fuels company previously specializing in biocrude production. This is a related business, as all the products of the ORNL process can be sold or used at a refinery. Oxygen can supply anything that would be otherwise fed by air separation, H2 can go straight to hydroprocessing, the CH4 replaces natural gas for SMR or process heat, and the CO can be added to the input of the reverse water-gas shift reactor in the SMR system to make more hydrogen. Maybe the efficiency is low, but when California has a low-carbon fuel standard and is paying Arizona to take its peak generation from PV, the efficiency is not such a huge factor.
The Engineer's take:
This is nowhere near the world-killing advance I thought it was when I read the first reports in 2016. The energy efficiency is just too low, and it doesn't include any overhead for CO2 capture or separating the ethanol from the aqueous medium.
As a dump load for unreliable electric generation (especially wind and PV), this might be just the ticket. So long as the catalyst is not degraded by voltage swings, this process can replace expensive or difficult-to-site storage such as batteries and pumped hydro. With enough capacity, negative wholesale electric prices would be a thing of the past. Sure can't complain about that.
The Poet's take:
Electrolytics
Making booze from cee oh two
Amuses me much.
UPDATE 3/9/2019: Not a peep. Looks like he's not interested in discussing things, his mind is made up.
UPDATE 1/20/2019: Not even a reply yet. I mailed "whydidnttheylaugh@gmail.com". If anyone knows whether this is the correct e-mail address, or whether it has fallen into disuse, let me know.
Comedian Owen Benjamin has decided that the USA never put men on the moon. (Why? Well... he's a comedian, not an aerospace engineer. There's a lot that's common knowledge in the field that someone so far from it just isn't going to know, and may have great difficulty understanding.) So I have issued him this challenge (in the comments of the video, though said comment does not appear to be visible to the public) and am repeating it here:
I don't have time to watch and dissect 70 minutes of this video plus however long the previous one is (video is NOT a medium for conveying accurate factual content) but I will make you a deal:
You contact me at the address on my blog (ergosphere dot blogspot dot com) and give me any five pieces of evidence that you like which you believe show that the moon landings were faked.
If I can explain that you misinterpreted things or that what you believe is evidence is outright wrong on at least four of them, you make a video about how the conspiracy theorists were misled by their own skepticism. I will help you write it and give you pointers to information.
I will publish the full exchange at The Ergosphere.
So. Challenge issued. I will keep you all up to date on the results.
We used to have a good relationship. I've purchased Dell laptops several times, as well as one well-loved monitor which met an ugly end in a moving accident. My main machine 2 computers ago was a Dell laptop on which I installed some flavor of Linux I've long since forgotten. It installed from an ISO I downloaded and ran like a dream for years until it experienced some age-related failure and refused to boot. A local shop pronounced it unrecoverable, so I moved on and bought another. That one (currently in use) runs Windows 7, which I steadfastly refuse to "upgrade" to anything else by Microsoft and have been too busy to try installing anything else on. I just switched to a much bigger hard drive, but I really want to recover my data from my old Linux Dell and an even older Linux machine. For this, I need Linux.
Need it. Can't do without it. Nothing else will do, full stop.
A while ago I bought a used Inspiron 5559 because Linux compatibility was a feature of that line; it was specifically advertised as an option. I did nothing with it for quite a while because Windows 10 has such a cloying abortion of a user interface and a mass of "telemetry" (spyware) beneath it. But when the time came that I HAD to get my hands dirty messing with computers I bought another hard drive for it too, figuring that now was the time to switch it over and finally get my old stuff back.
Immediately I started running into problems. Ubuntu 12.04 is listed as a compatible operating system for my machine on the Dell web site. Does that mean I can just download an ISO and go? If only! Everywhere I've turned I've been blocked, frustrated and stymied, and this frustration appears to be official Dell policy.
First thing, there are no ISOs on the Dell site. I'd be happy to pull down a few different ones and take my chances until I find one that works for me, but Dell has chosen to completely foreclose that option. Instead, everything must be done through the "OS Recovery Tool". Well, fine. I downloaded it on the Win 10 unit and ran it.
Or rather, tried to run it. I picked "Install" but it didn't appear to do anything. Searching through the cloying abortion of the Win 10 start menu, I found something that looked likely, but when I ran it, it created a "recovery drive" without asking me for any of the information it would require to do the install that I want. This behavior was repeatable. In frustration and anger I gave up for the evening.
Today I reformatted the flash drive, took it over to my Win 7 machine, and downloaded the recovery installer yet again. The first time it ran, it took quite a few minutes before ultimately reporting a failure in some kind of unzip process. (It won't even retry a failed operation?) I decided to try again, and after an equally long delay it reported success... but it never gave me a Linux option on the choices of OS to install, just Win10 and "SupportAssist OS Recovery". Well, maybe the recovery tool would let me install Ubuntu. I took the flash drive back to the Win 10 machine, plugged it in, hit power, pressed F12...
and I got a boot menu on which "SupportAssist" was one of the options!
At this point I remembered that I hadn't swapped out the Win10 hard drive for a clean one, and I wanted to save that drive Just In Case, so I powered down and spent some busy minutes with a screwdriver. New drive installed and machine buttoned up, I hit power and keyed F12 again.
SupportAssist was NOT on the list of options this time! Neither was USB boot. WTF? Well, maybe Secure Boot was the problem (but why not last time?). I disabled it and fired up again, which allowed me a USB boot option. That died with "Selected Boot Device Failed", behaving exactly the same on several attempts. The USB drive that had just worked a few minutes before was not working any more. Why? What did the software do to itself to make it unbootable? Stymied, frustrated and angry all over again, I went back to my Win 7 machine and used the tool to build the flash drive for the third time.
And that is where I am right now. The thrice-built USB boot stick is still giving me "Selected Boot Device Failed"; this appears to be a hard, unrecoverable error. I have tried using Disk Manager to wipe the stick and start over from scratch, but your tool appears to have locked it so that I can't remove the partition and try again, at least not under Win 7. (I can't re-use the stick for much else, either; there's a 2 GB partition and the rest "unallocated". That's malicious destruction of property.)
So here I am, Dell. What was a simple, fast process in late 2011 is an exercise in frustration and wasted time on the last day of 2018. Instead of simply giving me standard stuff like ISOs and drivers and letting me be responsible for the results, you deliberately stand in the way of me doing with MY computer as I want and need to. Thanks to you I have about $400 sunk into hardware that is useless in its current condition, including a brand-new USB flash drive that you have effectively stolen from me in any sense of getting full use out of it.
Dell, have you defrauded me? You told me that the Inspiron 5559 could run Linux, and I bought it on that representation. I've tried every way I can figure out to use YOUR tools to install YOUR approved version of Ubuntu on this machine, and I've come up empty. Was this a fuckup, or did you deliberately lie to me? What will you do to make it good?
Unless this situation turns around REAL fast, I am done with Dell computers. It's not me, it's you. Your control-freak behavior is somewhere between destructive and downright evil, and you need to get over it.
Power to Gas (P2G) is best for (solar, wind etc.) farm-scale energy storage for most farms where there is no possibility of farm-scale pumped hydro.
P2G is excellent for mopping up all the surplus farm power because any energy which P2G can store is an efficiency gain compared to the 100% loss of all curtailed generation.
Grid managers should cease paying curtailment payments and spend the same money more wisely offering incentives to farm operators to install farm-scale energy storage.
In the comment below I had the temerity to ask
Simply not generating surpluses very much or often gets rid of most spilled power too, and also the capital and operating cost of generating it. What's the goal here?
But mostly I wanted to go into the energetics in greater detail than I did there.
Present-day electrolyzers take around 43 kWh (154.8 MJ) of electricity to produce 1 kg of hydrogen. This 1 kg of hydrogen has 141.88 MJ higher heating value and 119.96 MJ lower heating value. Suppose this hydrogen is burned in a non-condensing context, such as a gas-turbine power plant or a kitchen stove. Almost 1/4 of the input energy is lost between the electrolyzer inefficiencies and the latent heat of the lost water vapor. Even if burned in a 60% efficient (LHV) CCGT, the net efficiency drops to about 45% before losses in pumping and storage are included.
What IS the goal of this exercise? Suppose for a moment that it is to displace CO2 emissions. How effective is P2G for this purpose? Well, not very. Replacing 1 kWh generated with OCGTs at 500 gCO2/kWh with 1 kWh generated with best-of-class CCGTs at 320 gCO2/kWh eliminates 180 grams of emissions. Replacing 1 kWh generated with natural gas with 1 kWh generated by P2G hydrogen eliminates... (working the units)
1 kWh / 43 kWh/kg * 119.96 MJ/kg / 50 MJ/kg(CH4) * 2750 gCO2/kg(CH4) = 153.4 gCO2 eliminated per kWh put into P2G. This will be roughly the same for any natural gas power plant, as it displaces fuel on a per-MJ LHV basis.
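Or as code, with the CCGT-upgrade comparison from the previous paragraph:

```python
# CO2 displaced per kWh of electricity: P2G vs. a simple plant upgrade.
p2g = 1 / 43 * 119.96 / 50 * 2750    # kg H2 -> MJ LHV -> kg CH4 -> g CO2: ~153.4
upgrade = 500 - 320                  # OCGT -> best-of-class CCGT, g CO2/kWh
print(f"P2G: {p2g:.1f} g/kWh displaced; CCGT upgrade: {upgrade} g/kWh")
```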
But that's not the end of it. What's not usually talked about is the effect of "renewables" on the rest of the generating mix. Due to the high ramp rates of wind and solar, the rest of the generation has to be highly flexible to compensate. More efficient combined-cycle plants can't ramp quickly due to thermal constraints on the steam side, and they can often only turn their output down by 30% or so. Given this (absent hydro), less-efficient open-cycle gas turbines are usually the only viable option. This cuts the maximum thermal efficiency from as high as 62% down to around 40%.
This is a bait-and-switch of enormous size. To get "renewable energy", you have to increase per-kWh emissions from the NG balancing generators on the order of 55% over what is achievable with CCGTs. Renewables would require a capacity factor around 35-36% just to break even on emissions; less than that and emissions are WORSE!
America has definitely fallen for the bait-and-switch. The job now is threefold:
Get to a metric of emissions, period. Where energy comes from is irrelevant; eliminate all portfolio standards and mandates, FITs, net metering, etc.
Aim at fuel/carbon efficiency rather than RE generation. RE which forces lower efficiency in the balancing generators can be worse than useless.
Use appropriate market design and system architecture to get efficiency plus resiliency.
There are some options out there which can easily beat 153 gCO2 savings per kWh input. The problem (if you can call it that) is that they are way, way outside the box of conventional thinking on energy matters.