Everyone in the West has heard of the Second Law of Thermodynamics, but very few can quote it accurately and even fewer actually understand it. The underlying entropy relation is very simple to state:
ΔS = ΔH / T_abs
In words, the change in entropy equals the change in enthalpy (another hard-to-grasp quantity, but it's got the units of energy) divided by the absolute temperature. This, incidentally, is why the Carnot limit for thermal efficiency is (T_source − T_sink) / T_source; at that limit, the entropy of the heat coming from the source equals the entropy of the heat going to the sink, so there is no net increase in entropy. No real heat engine achieves the Carnot limit, and most fall far short of it. You can generate electricity, which has almost zero entropy, from heat, but you have to throw out lots of waste heat, which carries away all the entropy of the inputs plus whatever entropy was generated in the process.
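To make the numbers concrete, here's a quick sanity check in Python (a sketch; the 900 K source and 300 K sink are just illustrative figures, not from any particular plant):

```python
def carnot_efficiency(t_source_k, t_sink_k):
    """Maximum thermal efficiency of a heat engine between two reservoirs."""
    return (t_source_k - t_sink_k) / t_source_k

eta = carnot_efficiency(900.0, 300.0)   # illustrative 900 K source, 300 K sink
print(f"Carnot limit: {eta:.1%}")       # 66.7%

# Entropy bookkeeping: heat drawn from the source carries Q_in/T_source of
# entropy; the waste heat Q_out = Q_in - W must carry at least that much out
# at T_sink.  At the Carnot limit the two are exactly equal.
q_in = 1.0                              # 1 J of heat from the source
s_in = q_in / 900.0                     # entropy drawn from the source
q_out = q_in * (1 - eta)                # waste heat at the Carnot limit
s_out = q_out / 300.0                   # entropy dumped to the sink
assert abs(s_in - s_out) < 1e-12        # no net entropy increase at the limit
```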
There are many, many natural processes in which local entropy decreases. Consider the freezing of water. Liquid water is a highly disordered substance, while ice is largely organized into 6-molecule rings; ice has far less entropy than the water does. So where does this entropy go? The answer is, it goes with the heat. The heat of fusion of 80 calories per gram, divided by the freezing point of 273.15 K, equals the difference of entropy between water and ice. If the temperature is lower than the freezing point, the heat which seeps into the environment from the freezing water has more entropy than the water loses in forming ice, so net entropy increases; if the temperature is higher than the freezing point, the heat coming from the environment to thaw the ice has less entropy than the ice gains in forming water, so net entropy increases. The Second Law is obeyed in both cases.
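The arithmetic, worked out (using the standard 4.184 J/cal conversion):

```python
CAL_TO_J = 4.184
heat_of_fusion = 80.0 * CAL_TO_J        # J/g, ~334.7 J/g
t_freeze = 273.15                       # K, the freezing point

# Entropy difference between liquid water and ice at the freezing point:
delta_s = heat_of_fusion / t_freeze     # J/(g*K)
print(f"Entropy of fusion: {delta_s:.3f} J/(g*K)")          # ~1.225 J/(g*K)
print(f"Per mole (18.015 g): {delta_s * 18.015:.1f} J/(mol*K)")  # ~22.1
```

The per-mole figure, about 22 J/(mol·K), matches the tabulated entropy of fusion of water, which is a good check that the 80 cal/g figure and the reasoning above hang together.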
What happens to this entropy? It doesn’t accumulate on earth; it gets radiated out into space along with the escaping heat. The entropy of the universe in general increases, but the entropy of earth is DECREASING as e.g. heat from the core and mantle seeps up to the surface and is lost.
This applies to more complex processes as well. Consider the bête noire of all “2LOT makes evolution impossible” zealots: the green plant. A plant takes high-entropy CO2 and water and turns them into low-entropy sugars and amino acids and phenols and lipids, from which it builds even lower-entropy cellulose, lignin and cells. Since we don’t have any processes which can decrease entropy in general, where is it going? It’s expelled from the plant as low-entropy sunlight (effective temperature of about 5700 K) is converted into high-entropy ambient heat. Every step from photon to excited chlorophyll molecule to ATP to splitting water to reducing CO2, making sugars, etc. is irreversible; each generates entropy. But the plant itself becomes MORE orderly as gas and water are assembled into organized molecules and larger units; the entropy is dissipated as heat and ultimately radiated off to space.
The Miller-Urey experiment, repeated many times over, is more proof that the 2LOT is no obstacle to local decreases in entropy. Converting water and light gases into a soup of sugars, amino acids and nucleotides is most definitely a decrease in entropy, but it’s far outweighed by the entropy generated as the electricity for the spark or UV light is converted to ambient heat. The key take-home fact is that entropy doesn’t stay inside the reaction vessel. The contents can become more orderly over time.
I’ve never encountered a single anti-evolutionist who can address these facts. It’s a mind-killer for them; it threatens their entire concept of self. They’re as bad as social justice warriors in that regard, and that’s saying something. When irrefutable facts are rejected as a moral issue, that’s a problem.
It was obvious from the outset that the premises behind the Energiewende were false (and possibly fraudulent). While building out huge amounts of wind and PV generation, Germany has thus far propped up the house of cards with massive subsidies. Now those subsidies are expiring, and wonder of wonders, look what is happening:
Around 5,000 wind turbines with a total output of 3.7 gigawatts (GW) will fall out of the 20-year EEG subsidy regime at the end of the year. Now the operators have another problem: Due to the corona pandemic, prices on the electricity market have dropped drastically, which means that the vast majority of older wind farms lack the prospect of economic viability. They face shutdown and the German electricity mix is threatened with the loss of considerable amounts of green electricity.
The amounts of electricity from the old wind turbines could be sold on the exchange or directly to energy suppliers. However, due to the corona crisis and the falling wholesale price of natural gas, there has been a significant drop in prices on the electricity market in recent months. Last July a megawatt-hour of electricity cost over 53 euros; on March 23 it was just under 34 euros. Even at the cheapest locations, the costs of continued operation can hardly be recouped, so massive shutdowns loom.
Germany was betting that those wind farms would continue to operate and continue to generate carbon-free electricity, likely long into the future. But the skeptics were right: as soon as the subsidies stop, so do the pinwheels. The carbon impact of this is unclear, but if 3.7 GW of wind at 21.7% capacity factor is retired and replaced with gas-fired turbines emitting 550 gCO2/kWh, that's an additional 3.87 million tons of CO2 emitted per year. That is a significant hit, increasing German CO2(e) emissions by more than half a percent over 2018's 725.7 million tons CO2(e). When successes are measured in fractions of a percent per year, this is yet another major blow to climate goals and shows just how wrong-headed the ideas behind the Energiewende were all along.
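For anyone who wants to check my arithmetic, here it is (the 21.7% capacity factor and 550 gCO2/kWh are the assumptions stated above, not measured outcomes):

```python
wind_gw = 3.7                  # wind capacity falling out of the EEG subsidy
capacity_factor = 0.217        # assumed capacity factor for the old farms
gas_intensity = 550.0          # gCO2/kWh for replacement gas-fired generation
hours_per_year = 8760

annual_kwh = wind_gw * 1e6 * capacity_factor * hours_per_year
added_co2_mt = annual_kwh * gas_intensity / 1e12    # megatons CO2 per year
share = added_co2_mt / 725.7                        # vs. 2018 German CO2(e)
print(f"{added_co2_mt:.2f} Mt/yr, {share:.2%} of 2018 emissions")
# ~3.87 Mt/yr, ~0.53% -- "more than half a percent"
```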
If you dutifully read your U.S. mainstream media, you undoubtedly have the impression that “clean” and “green” energy is rapidly sweeping all before it, and soon will supplant fossil fuels in powering our economy. After all, many major states, including California and New York, have mandated some form of “net zero” carbon emissions by 2050, or in some cases even earlier. That’s only 30 years away. And reports are everywhere that investment in “renewables,” particularly wind and solar energy, continues to soar. For example, from Reuters in January we have “U.S. clean energy investment hits new record despite Trump administration views.” In the New York Times on May 13 it’s “In a First, Renewable Energy Is Poised to Eclipse Coal in U.S.” The final victory of wind and solar over the evil fossil fuels must then be right around the corner.
Actually, that’s all a myth. The inherent high cost and unreliability of wind and solar energy mean that they are highly unlikely ever to be more than niche players in the overall energy picture. Politicians claim progressive virtue by commissioning vast farms of wind turbines and solar panels, at taxpayer or ratepayer expense, without anyone ever figuring out — or even addressing — how these things can run a fully functioning electrical grid without complete fossil fuel backup. And the electrical grid is the easy part. How about airplanes? How about steel mills? I’m looking for someone to demonstrate that this “net zero” thing is something more than a ridiculous fantasy, but I can’t find it.
As I like to say, "I'm from Missouri. Show me."
The fact is that Germany has nowhere further to go by building more wind and solar facilities. When the wind blows on a sunny day, they already have more power than they can use, and they are forced to give it away to Poland (or even pay the Poles to take it). On a calm night, no matter how much wind and solar they build, it all produces nothing. Without the coal plant, the lights go out. Talk about climate virtue all they want, but no one has yet even begun to work on a solution to get past this hurdle.
Which brings me to the most important piece in the GWPF email, from Cambridge Professor Michael Kelly, appearing in something called CapX on June 8, with the headline “Until we get a proper roadmap, Net Zero is a goal without a plan.”
Been saying this too.
Read enough of this stuff and you gradually realize that almost everything you read about supposed solutions to climate change is completely delusional.
RTWT, and follow any link that looks good. You'll be glad you did.
Now I come to the fraud that is Renewable Energy Certificates. Plainly put, they are a way to claim virtue for being "renewable" while still relying on a dirty grid. Real Clear Energy just called out New England on it:
New Englanders like the idea of wind energy; they just don’t want any wind turbines in New England. So they are putting them in New York.
For proof of that, consider the 126-megawatt Cassadaga Wind Project, now being built in Chautauqua County, New York’s westernmost county.
In an email, a spokesperson for Innogy confirmed that the buyer of the power to be produced by Cassadaga “is a group of seven New England utilities procured through the New England Clean Energy request for proposals” in 2016. How will the juice from New York get to New England? It won’t. Instead, the Innogy spokesperson told me that the energy produced by the turbines at Cassadaga “will be used to serve local energy requirements in areas surrounding the project. Export to areas outside New York would require dedicated point-to-point transmission lines.”
Nevertheless, thanks to the magic of renewable-energy credits, New England utilities will get to claim the wind energy that’s being produced in Chautauqua County as their own. The Innogy spokesperson said the utilities “can purchase the energy generated from Cassadaga Wind without having a direct point-to-point transmission connection.”
When completed, the Cassadaga project will increase the amount of renewable energy that is being generated in New York but that will be credited to New England.
It's nice to see bigger voices than mine calling out the fakery by name. More, please.
One of the glaring flaws (far more than a mere foible) of "renewables" (wind and PV) is that they are unreliable. SO unreliable, as a matter of fact, that they force the adoption of much dirtier fossil-fired generators to accommodate their output swings.
Naive greenies think that "RE" can just be thrown onto the grid, but in fact an RE-heavy grid requires different generating technologies than one with little or none. You can generally follow the normal load curve using a CCGT plant, which can be up to 64% efficient (LHV). Following the bumpiness of "renewables" mostly requires simple-cycle gas turbines (the CCGT steam systems don't like rapid power variations); the best open-cycle I've read about gets only 46% efficiency, and I recall that the single-shaft industrial models often get something like 38%. IOW, you're burning a lot more fuel for the same electric output. This puts you way behind emissions-wise.
Let's use a real-world example: the Mitsubishi-Hitachi M501JAC gas turbine, which is available in both simple-cycle and combined-cycle versions. This allows a head-to-head comparison. The single-unit combined-cycle version of the M501JAC is rated at 614 MW and 64.0% LHV efficiency. It doesn't even HAVE a specified turndown ratio, minimum rated output, rated ramp rate or startup time. One can conclude from this that it really isn't suitable for trying to follow the ups and downs of "renewables", though it can probably handle normal load curves because other steam-turbine plants have been doing it for the last century.
The simple-cycle heat rate of this unit is 7775 kJ/kWh (LHV). Since a kilowatt-hour is 3600 kJ, we just divide 3600 by 7775 to get 0.463, or 46.3%. The rated output is 425 MW and the rated ramp rate is 42 MW/minute, or about 10% per minute; it can be turned down to 50% of rated output, so it can go from minimum to full output in 5 minutes. This can track things like surges and sags from passing clouds and weather fronts pretty well. Its startup time is specified as 30 minutes.
What you pay for this flexibility is efficiency. Going from 64.0% down to 46.3% means burning 38% more fuel. Put another way, you need to get 27.6% of your juice from emissions-free sources just to break even on the increased emissions from going from combined-cycle to simple-cycle... and that assumes that you maintain the 46.3% efficiency at lower output power, which you won't. GE makes this data very hard to find, but the efficiency of the LMS100 gas turbine drops from 44.3% at rated power down to under 40% at half rated power (the minimum). This means even MORE fuel required.
Typical capacity factors for wind are 30-40%; PV is much lower. If you're getting 30% of your juice from "renewables", and you're burning at least 38% more fuel per kWh to get the rest, you're saving less than 3.3% from the CCGT emissions figure. At low enough capacity factors, you can actually burn more fuel with the addition of "renewables" than what you could do with all-fossil.
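The whole fuel-penalty argument fits in a few lines of Python (using the M501JAC efficiency figures above, and charitably assuming the simple-cycle unit holds 46.3% at all loads, which it won't):

```python
eta_ccgt = 0.640    # combined-cycle LHV efficiency (M501JAC)
eta_ocgt = 0.463    # simple-cycle LHV efficiency (M501JAC)

# Fuel burned per kWh scales inversely with efficiency:
extra_fuel = eta_ccgt / eta_ocgt - 1           # ~38% more fuel per kWh
# RE share needed just to cancel the CCGT -> OCGT penalty:
breakeven_re = 1 - eta_ocgt / eta_ccgt         # ~27.7%

re_share = 0.30                                # assumed "renewables" share
rel_fuel = (1 - re_share) * (eta_ccgt / eta_ocgt)  # fuel vs. all-CCGT baseline
print(f"extra fuel, simple cycle:  {extra_fuel:.1%}")    # ~38.2%
print(f"break-even RE share:       {breakeven_re:.1%}")  # ~27.7%
print(f"fuel saving at 30% RE:     {1 - rel_fuel:.1%}")  # ~3.2%
```

Below a roughly 27.7% "renewables" share, this mix burns *more* fuel than running the combined-cycle plant alone, which is the whole point.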
Is it worth spending so much money for such paltry gains? Even if your wallet can stand it, can the planet?
Now, don't let it be said that there aren't ways around this. With enough excess RE capacity you can just brute-force the issue by dumping excess power to resistance heaters in a CCGT's gas turbines, substituting electricity for fossil fuel and managing the rapid power swings on the demand side. But this is going to hit the economics, and nobody even seems to be thinking that far out of the box.
Hypedrogen has been the holy grail of the renewablistas since the 1970's, when it was also one of the magic bullets that was going to solve air pollution from cars. Here we are 50 years later and we're still burning gasoline (and the oilcos have laughed all the way to the bank). The talk seems to be getting more serious recently. But is it realistic?
But the really interesting part of that article comes down at the very end:
MHPS and Magnum Development have partnered on the idea of building an electrolysis facility near the Intermountain Power Plant around Delta, Utah. The electrolysis–which uses electricity to separate water into its hydrogen and oxygen molecules–would be powered by renewable energy, such as western U.S. solar, wind and hydro. The resulting hydrogen would be stored in underground salt caverns deep beneath the Utah rocky soils.
Ducker estimated that each of those salt caverns potentially could store 150,000 to 200,000 MWh of hydrogen capacity. The area could offer dozens of those caverns, all impermeable and yielding no energy loss.
“Think of it as a really really big battery,” he said.
A really really big, if rather lossy, battery. I note that storage in salt domes avoids any issues of sulfate minerals which hydrogen could react with and be lost as hydrogen sulfide and water. You'd likely have that trouble if you tried to use old gas wells to store hydrogen. Methane is an extraordinarily stable molecule; hydrogen is not.
Let's assume, out of charity, that those numbers are the energy you could get out of the hydrogen power plant rather than the 56% larger requirement for stored energy, or the even greater figure for energy input to make it in the first place. 150,000 to 200,000 MWh of energy sure sounds like a lot if you're not familiar with the field, but it's roughly 1 week of generation from a 1000 MW power plant... of which the USA has the equivalent of about 460 running flat out on average. A reserve of 90 days of energy (what the Trump administration wants at least some plants to hold in case of fuel supply disruptions) is roughly 13 weeks, or about 1 dozen such reservoirs to supply just one 1000 MW plant. "Dozens" of reservoirs translates to just a handful of plants being able to ride through a sustained period of energy famine... such as the most populated parts of the USA endure every winter when the sun heads south. To supply the electric grid reliably you'd need close to 10,000 of them. And that wouldn't supply the requirements for heating fuel, for vehicle fuel, for industrial heat and chemical feedstock.
You'd need probably 20,000 such reservoirs just to have 90 days of energy security. The area has "dozens". The inadequacy of the resource to the task is obvious once you know where to look. And that's just one of the problems you'll face if you try to power any industrial economy on "renewables".
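A sketch of the reservoir arithmetic (the 175,000 MWh cavern midpoint, 460 GW of average US generation, and the ~56% stored-energy overhead for conversion losses are the figures assumed above):

```python
cavern_mwh = 175_000                       # midpoint of quoted 150k-200k MWh
plant_mw = 1000                            # one nominal 1000 MW plant

weeks_per_cavern = cavern_mwh / (plant_mw * 168)       # ~1 week of output
caverns_90d = plant_mw * 24 * 90 / cavern_mwh          # ~12.3 per plant

us_avg_gw = 460                            # average US generation, per the text
caverns_grid = caverns_90d * us_avg_gw     # ~5,700 on delivered-energy basis
caverns_with_losses = caverns_grid * 1.56  # +56% stored energy for conversion
print(f"{weeks_per_cavern:.2f} weeks/cavern; {caverns_90d:.1f} caverns/GW")
print(f"~{caverns_with_losses:.0f} caverns for 90 days of grid supply alone")
# ~8,900 caverns -- "close to 10,000", before heat, vehicles and industry
```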
It's time to face facts. The "environmentalists" are demonstrably not doing the arithmetic (and it's arithmetic, not even algebra) to properly understand the magnitude of the gap between their proposals and reality. It's likely that they have been forbidden to do the arithmetic by the people who finance their organizations. Who would benefit from such betrayal? Fossil fuel interests.
Save for a few dissident organizations like Environmental Progress and the Breakthrough Institute, the environmental movement has become a front for fossil fuel interests. They've been corrupted by donor money. Do not trust them. Do not listen to them. And when their activists come to plug "renewables", call them the liars they are.
Once commonly considered a “bridge fuel,” electric utilities now must face the mathematical reality that fast-falling clean energy costs mean the bridge only leads to climate breakdown and the destruction of shareholder value.
Natural gas is being used as the backup fuel for balancing the unreliable supplies from wind and solar. It is delivered on a just-in-time basis and cannot be stockpiled, so it is inferior to coal as a buffer against supply disruptions or demand surges. However, it is currently cheaper than coal so those deficiencies are being overlooked until crises like the "polar vortex" strike.
A new report from Energy Innovation and shareholder advocacy group As You Sow outlines these evolving risks for shareholders.
In other words, O'Boyle's group.
Utility investment in new natural gas infrastructure makes less and less sense from multiple angles and only compounds risks for investors, consumers, and society. New natural gas infrastructure is incompatible with a low-emissions future and faces intense economic competition from wind, solar, storage, and clean energy technologies.
Solar's output is strongly counter-cyclical to demand at northern latitudes, being the least available when it is needed the most for heating.
"Storage" meaning batteries or PHS, presumably. PHS systems are typically sized for less than 24 hours at full power and batteries generally a handful of hours at most.
What OTHER "clean energy technologies" are there? Are any of them deployed at scale? Can they be expanded? This is deliberate deception.
On with the show.
Greater scrutiny of fossil fuel infrastructure at the regulatory commission level also looms large. Financial and climate concerns have recently led several local commissions (in Indiana, Arizona, and California) to reject utility plans for new gas power plants. These actions point toward a future where demand for gas is limited.
Unless there is another source of energy to fill in for the frequent absences of wind and solar, gas will still be required. The requirements will not be as much overall, but peak demand will remain and perhaps even increase as efficient combined-cycle plants are shut down and replaced by open-cycle peakers.
Today, new unsubsidized wind costs $28-54/megawatt-hour (MWh), and solar costs $32-44/MWh, while new combined cycle natural gas costs $44-68/MWh. In short, in almost all jurisdictions, utility-scale wind and solar are now the cheapest source of new electricity without subsidies.
Using LCOE is deliberate deception. LCOE ignores the costs of firming and backup, which O'Boyle wants everyone to ignore. The only even-somewhat valid figure of merit for unreliable generators is Levelized Avoided Cost of Energy (LACE).
For example, NV Energy’s recent procurement of 1,200 megawatts (MW) of solar and 580 MW of four-hour battery storage already beats new natural gas on price. NV Energy paid $20/MWh for solar and $13/MWh for enough battery storage to shift 25% of daily energy, resulting in a total cost of $33 per MWh delivered (including federal tax credits).
We're supposed to ignore "including federal tax credits" because we can all make out better by robbing someone else's taxes paid to fund our own energy consumption. Not.
While often cited as the clean energy transition’s largest barrier, it is increasingly clear new natural gas won’t be needed to ensure grid reliability.
And who's saying this?
Studies by the National Renewable Energy Laboratory, National Oceanic and Atmospheric Administration, Evolved Energy, and Vibrant Clean Energy have found that 80% or more of our electricity could be produced from renewable sources without reliability or affordability issues.
Additional gas capacity, baseload generation 'critical' to maintaining reliability: DOE analysis
A new analysis from the U.S. Department of Energy's National Energy Technology Laboratory (NETL) concludes additional natural gas pipeline capacity and baseload generation units, such as coal and nuclear, are "critical" to maintaining grid reliability and affordable electricity in the Eastern Interconnection during extreme weather events.
Coal power advocates argue that the continued retirement of coal-fired generating units threatens grid reliability and could lead to double-digit spikes in electricity prices in several wholesale markets, but clean energy advocates counter that renewables are now the cheapest energy option and can keep the grid operating reliably.
According to the NETL report, a "conservative" analysis shows investment in new pipeline capacity of more than $1 billion is needed to maintain reliability, though dual-fueled plants can partially relieve peak demand.
As more wind and solar energy comes online, the new DOE study questions whether those intermittent resources can maintain reliability in extreme weather.
“As the power sector relies more and more on natural gas and renewable sources for power generation, infrastructure must keep pace with this growth,” NETL Director Brian Anderson said in a statement.
NETL's report examines the near-term economic and reliability costs associated with expanding the natural gas generation network. The analysis concludes dual-fueled plants can partially relieve peak demand for natural gas, "but it will be difficult to maintain adequate fuel availability to meet that demand when more coal and nuclear resources are lost."
According to the DOE research, there is a need for additional pipeline capacity as thermal generating units are retired.
"Natural gas deliverability constraints lead to high fuel and electricity price spikes," the report find. It concludes those spikes are "exacerbated by the continued retirements of thermal units," which are expected to top 44 GW through 2024.
"Conservatively, an investment of $470 million to $1.1 billion over that already entrained in the long-haul natural gas transmission system is identified to avoid even worse outcomes," the report estimates.
Total hits for "nucl": 3.
It's obvious who's being honest, and who's lying through their teeth.
If we can grab CO2 out of the atmosphere for 2 GJ/ton, 1 TW(e) would capture 15.8 gigatons/year; at 2.5 GJ/ton you'd get almost 13 GT/yr. The world only emits about 35 GT/yr; 3 TW(e) would likely get it all, and then some. Of course the best solution is to use carbon-free (e.g. nuclear) energy to avoid generating CO2 in the first place, but if we need to reduce CO2 levels rapidly we now have something in the toolbox.
3 TW(e) or even 1 TW(e) is a lot, but since the atmosphere is global you can do CO2 capture anywhere, any time; you can e.g. overbuild nuclear and use surplus generation to scour CO2, or put floating wind farms in the wind belts like the "roaring forties" and have them grab CO2 and put it on the sea floor in bags. Excess atmospheric CO2 is rapidly becoming a problem with a real engineering solution.
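The scaling arithmetic is trivial to check:

```python
SECONDS_PER_YEAR = 365.25 * 86400          # ~3.156e7 s

def capture_gt_per_year(power_tw, gj_per_ton):
    """Gigatons of CO2 captured per year by power_tw terawatts of input
    at an energy cost of gj_per_ton GJ per ton captured."""
    joules = power_tw * 1e12 * SECONDS_PER_YEAR
    tons = joules / (gj_per_ton * 1e9)
    return tons / 1e9                      # gigatons/year

print(capture_gt_per_year(1, 2.0))   # ~15.8 Gt/yr at 2 GJ/ton
print(capture_gt_per_year(1, 2.5))   # ~12.6 Gt/yr at 2.5 GJ/ton
print(capture_gt_per_year(3, 2.5))   # ~37.9 Gt/yr, more than the ~35 Gt emitted
```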
I wondered what the characteristics of such a capture system might be. If this polyanthraquinone worked down to a concentration of 300 ppm given sufficient driving voltage, and consumed 110 kJ/mol (2.5 GJ/ton) in the process, what would it look like?
Assuming a Roaring Forties wind speed of no less than 8 m/s, blowing through an ocean-borne capture system with internal air speed of 5 m/s, each square meter of frontal area processes 5 m3 of air per second. At 400 ppmv CO2 concentration at the inlet and 300 ppmv at the outlet, CO2 would be removed at the rate of 0.5 l/sec. Given CO2 gas density of 1.907 g/liter at 10°C, this comes to 0.954 gCO2/m2/sec; at 2.5 GJ/ton (2.5 kJ/g) that's about 2.38 kW/m2 of collector area. Assuming the full power input is dissipated as heat, the air temperature rise through the collector would be less than 0.5°C.
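Worked out in Python (same assumptions: 5 m/s internal air speed, 100 ppmv removed, 2.5 GJ/ton; the 1.25 kg/m³ air density and 1005 J/(kg·K) heat capacity are standard textbook figures I'm adding for the temperature-rise estimate):

```python
air_speed = 5.0           # m/s through the collector
co2_removed_ppmv = 100e-6 # 400 ppmv in, 300 ppmv out
co2_density = 1.907       # g per liter at 10 C
energy_per_g = 2.5        # kJ/g, i.e. 2.5 GJ/ton

# Per square meter of frontal area:
co2_l_per_s = air_speed * 1000 * co2_removed_ppmv   # 0.5 l/s
co2_g_per_s = co2_l_per_s * co2_density             # ~0.954 g/s
power_kw_m2 = co2_g_per_s * energy_per_g            # ~2.38 kW/m^2

# Temperature rise if all input power ends up as heat in the air stream:
air_mass_flow = air_speed * 1.25                    # kg/s per m^2 (rho ~1.25)
dT = power_kw_m2 * 1000 / (air_mass_flow * 1005)    # cp of air ~1005 J/(kg*K)
print(f"{power_kw_m2:.2f} kW/m^2, dT = {dT:.2f} K") # ~2.38 kW/m^2, ~0.38 K
```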
12 m/s wind speed is about where most commercial wind turbines reach their rated output. The output curves, where given, do not follow a straight cubic function as one would expect from the raw physics; the rise is more rapid at low speeds and approaches maximum asymptotically rather than at a sharp corner. This makes it difficult to estimate the output at lower speeds. However, extrapolating from the cubic curve, at 8 m/s a minimum of 29.6% of rated output can be assumed. Full rated output is generated at 12 m/s, which is at the top of the typical range of wind speeds. Splitting the difference, an average generation of 65% seems reasonable.
Soaking up the full rated output of a 6 MW wind turbine at 2.38 kW/m2 of collector area would require 2521 m2 of collector. This is large, but hardly impossible; it's a square slightly more than 50 m on a side, compared to a machine with a rotor diameter over 200 m. Higher wind speeds might increase the flow through the collector, thus requiring less area. At full power, a 6 MW(e) wind turbine could power a collector extracting 2.4 kg/sec of CO2 from the atmosphere. That's 8640 kg/hr, 207 tons/day, 75.7 thousand tons/year; this gets to serious quantities very quickly.
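The collector sizing, for anyone checking my work:

```python
turbine_mw = 6.0          # rated output of the wind turbine
power_kw_m2 = 2.38        # collector power density from the earlier estimate
energy_per_g = 2.5        # kJ per gram of CO2 (2.5 GJ/ton)
SECONDS_PER_YEAR = 365.25 * 86400

collector_m2 = turbine_mw * 1000 / power_kw_m2      # ~2521 m^2
side_m = collector_m2 ** 0.5                        # ~50 m on a side
co2_kg_s = turbine_mw * 1000 / energy_per_g / 1000  # kW/(kJ/g) -> g/s -> kg/s
print(f"{collector_m2:.0f} m^2 ({side_m:.1f} m square)")
print(f"{co2_kg_s:.1f} kg/s = {co2_kg_s*3600:.0f} kg/h = "
      f"{co2_kg_s*86400/1000:.0f} t/day = "
      f"{co2_kg_s*SECONDS_PER_YEAR/1000:,.0f} t/yr")
# ~2.4 kg/s, ~8640 kg/h, ~207 t/day, ~75,700 t/yr per turbine at full power
```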
So yes, it does appear likely that we can remediate the earth's atmosphere to any CO2 concentration that we deem desirable and appropriate. If we don't have the technology yet, we are well on our way to having it in time; we don't have the energy yet, but we have every reason to get it for other reasons. At this point, all we really need is the will to get the job done.
¶ 11/09/2019 09:49:00 PM · 1 comment
These jerks made the PDF as difficult as possible to process, starting with saving it as a series of page images rather than text which can be searched and copied. That's just one of the ways it emits the aroma of snake oil. Page 5:
Each conical structure 200, opposing each other in pairs, may have smoothly curved apex sections 201, and/or include assemblies of electrified grids 202 and toroidal magnetic coils 203.
Toroidal coils confine their magnetic fields inside the minor radius; they have next to no magnetic field outside it. I could see a solenoid coil, but toroids would simply be useless for influencing a plasma outside the coil itself, and that includes the space between these so-called "fusors". This looks like fusion word salad.
In order to heat the plasma core 75 at the extreme temperatures that fusion requires, the electrically charged dynamic fusors 200, 230 generate high electromagnetic radiation by virtue of their accelerating spin.
Word salad. The mass of plasma is negligible compared to the mass of tungsten-based electrodes. The one thing I could see as a possibility is the use of mechanical twisting of a magnetic field around a diamagnetic plasma to induce currents and consequent heating, but that would require solenoid coils rather than toroidal coils.
In order to hold an electric charge of at least one Coulomb
One coulomb is an enormous amount of electric charge. Supercapacitors store multiple coulombs by way of equally enormous amounts of surface area on their virtual "plates", which are made of things like activated carbon. In a small device with discrete plates and capacitance measured in picofarads to nanofarads, storing a coulomb would require voltages from billions of volts on up. That's far in excess of the breakdown voltage of any available material and would immediately arc over. There are equally enormous energies involved. One coulomb in a gigavolt capacitor stores 5e8 joules, about 139 kWh. Forget fusion; if you can handle that, you've got a killer battery. IOW, ain't gonna happen.
My impression is that this is going to be revealed shortly as Sokal Hoax III, an epic troll of both the Green energy believers and the USPTO. I wouldn't be the least bit surprised to learn that this "inventor" doesn't even exist.
Edit: Thunderf00t is a good storyteller but weak on the nuclear stuff. Here are my notes, addressed as an open letter to him:
First, you missed a completely obvious way to debunk the "5 megaton" garbage. It only took 10 megatons to completely erase the island of Elugelab in the Ivy Mike test. 5 MT would have scoured Pripyat off the ground and turned the entire Chernobyl power plant to vapor. Instead, most of the reactor building was still standing! That wasn't a megaton or even kiloton-level explosion; it was worth, at most, a few hundred pounds of TNT.
Second, you've got a whole lot of your concepts about nuclear fission pretty badly wrong.
The reason that low-enriched uranium can't make a bomb is that you literally cannot sustain a chain reaction in pure, unmoderated LEU, or even LEU oxide, no matter how much of it you have. The detail of "cross sections" comes to bite you: a fission neutron straight from a nucleus is about as likely to be absorbed by a U-238 nucleus that it goes near (and make no further neutrons) as it is to be absorbed in passing by a U-235 nucleus. With U-238 being vastly more abundant, fission neutrons can't replace themselves and the "reaction" has no "chain"; the chain gets broken almost immediately.
So, how did the Chicago crew create a chain reaction in natural uranium (just 0.711% U-235)? They had a MODERATOR, in the form of a big pile of relatively pure graphite bricks. The graphite, almost pure carbon, only rarely tends to absorb neutrons but does a fairly good job of slowing them down as the neutrons bounce around. And as the neutrons slow down, a funny thing happens: U-235 atoms are HUGELY more successful in catching slow ("thermal") neutrons than U-238 atoms are. When you get things slowed down JUST enough that each fissioning atom leaves neutrons that wind up splitting exactly one more atom, the chain goes unbroken: you have a self-sustaining "chain reaction". But for this to work, the moderator has to be between the fuel elements and slow neutrons down before they can get sucked up by U-238 or escape entirely.
What does this have to do with a reactor meltdown? As soon as the fuel melts and runs together, it loses the moderation because the moderator is now outside the fuel mass, not between bits of it. Ergo, the chain is broken and the reaction stops. (In reactors using water as a moderator, losing the water also shuts down the chain reaction. Chernobyl used graphite.)
But that doesn't stop the heat. The OTHER thing you neglected is that the fission reaction itself is not the only source of heat in a reactor! About 6.5% of the energy actually comes from the radioactive decay of the fission products, the daughter nuclei created by the splitting atoms. This heat does not stop when the chain reaction stops; you have to wait for the material to "cool" as the "hottest" fission products decay away. The stuff that decays the fastest releases heat the fastest, and goes away fastest. Within an hour the "afterheat" is down to 1.5%, 0.4% after a day and 0.2% after a week.
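The classic Way-Wigner rule of thumb reproduces those figures well enough for a sanity check. It's only an approximation, and I'm assuming a year of prior full-power operation here:

```python
def decay_heat_fraction(t_s, run_s=3.156e7):
    """Way-Wigner approximation: decay heat as a fraction of full thermal
    power, t_s seconds after shutdown, following run_s seconds of
    full-power operation (default ~1 year)."""
    return 0.0622 * (t_s**-0.2 - (t_s + run_s)**-0.2)

for label, t in [("1 hour", 3600), ("1 day", 86400), ("1 week", 604800)]:
    print(f"{label:>7}: {decay_heat_fraction(t):.2%}")
# roughly 1%, 0.4%, 0.2% -- the same ballpark as the figures above
```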
Maybe you want to re-record some of your narration on your video to get those details right. Just sayin'.
PS: No I was not drunk when I wrote this, just fat-fingered. All typos spotted have been corrected.
This paper should be shaking the world. It should have turned our radiation-exposure standards upside-down. It should have established that regular low-dose radiation exposure is our best prophylactic against both cancer and birth defects. Yet nothing of the sort has happened.
Well, what happened? From the paper:
Abstract — The conventional approach for radiation protection is based on the ICRP’s linear, no threshold (LNT) model of radiation carcinogenesis, which implies that ionizing radiation is always harmful, no matter how small the dose. But a different approach can be derived from the observed health effects of the serendipitous contamination of 1700 apartments in Taiwan with cobalt-60 (T½ = 5.3 y). This experience indicates that chronic exposure of the whole body to low-dose-rate radiation, even accumulated to a high annual dose, may be beneficial to human health.
Approximately 10,000 people occupied these buildings and received an average radiation dose of 0.4 Sv, unknowingly, during a 9-20 year period. They did not suffer a higher incidence of cancer mortality, as the LNT theory would predict. On the contrary, the incidence of cancer deaths in this population was greatly reduced – to about 3 per cent of the incidence of spontaneous cancer death in the general Taiwan public. In addition, the incidence of congenital malformations was also reduced – to about 7 per cent of the incidence in the general public.
The paper contains a graph of cancer mortality in the exposed population. [graph not reproduced]
Are the people in charge of our "health" evil, or just stupid?
Edit: Backup paper link https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2477708/
Three Mile Island. The name still elicits fear, forty years later. Yet the whole accident had zero casualties; there were no deaths and no injuries.
The list of energy-related accidents with greater tolls is long. Natural gas pipeline explosions have killed quite a few in the USA alone. An oil train wiped out the center of Lac-Mégantic in 2013, killing 47. And collisions between road vehicles and coal trains regularly kill and injure, mostly in ones and twos.
PHOENIX (AP) — Arizona's largest electric company installed massive batteries near neighborhoods with a large number of solar panels, hoping to capture some of the energy from the afternoon sun to use after dark.
Arizona Public Service has been an early adopter of battery storage technology seen as critical for the wider deployment of renewable energy and for a more resilient power grid.
But an April fire and explosion at a massive battery west of Phoenix that sent eight firefighters and a police officer to the hospital highlighted the challenges and risks that can arise as utilities prepare for the exponential growth of the technology.
Despite the very small number of units in service, this is not the first battery fire. It won't be the last, either; current plans involve many more and much bigger installations. Running up a list of casualties while being such a minor component of the electric system ought to have people asking questions, like...
"Are these things safe to have in my neighborhood?"
"Are these things safe to have anywhere?"
Anyone who dares to ask those questions, though, is bound to come under vicious attack from the proponents of "renewables". Meanwhile, those same proponents spread fear of nuclear power, despite nukes being objectively much safer than even smallish utility-scale batteries.
Evil, or just crazy? It's got to be one or the other.
I'm interested in the mass yields and energy efficiency of the process, which requires converting from faradaic to regular physical units. First comes the required charge transfer per reaction. I calculate the stoichiometry as follows:
2 H3O+ + 2e- → H2 + 2 H2O
CO2 + H2O + 2e- → CO + 2 OH-
CO2 + 6 H2O + 8e- → CH4 + 8 OH-
2 CO2 + 9 H2O + 12e- → CH3CH2OH + 12 OH-
Given the Faradaic conversion efficiencies to various products as given in the paper, I come up with these net yields:
[Table: Faradaic yield (%) and product mass yield (g per mol CO2) for each product; data not preserved in this copy]
Re-crunching this with an eye toward heat of combustion of the products:
Calculating total input energy naïvely, 96485 coulombs times 1.2 volts yields 115.8 kJ. This is clearly nonsense. Going back to the electrochemistry, the paper declares that the potential is given "vs. RHE", a reversible hydrogen electrode. The oxygen evolution reaction at the anode will occur at a considerably higher potential than this. The equilibrium potential of an oxygen electrode is +1.23 V vs. RHE, which sets a floor of 2.43 V on the cell voltage. Using that, 96485 coulombs times 2.43 volts yields 234.5 kJ for a maximum electricity-to-fuel efficiency of 52.3%; only 37% goes toward reducing CO2 and just 30.6% to energy in ethanol. 15.3% goes to hydrogen.
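A sketch of that accounting for the ethanol channel. The 63% faradaic yield to ethanol is my assumption for the paper's reported value (it reproduces the 30.6% figure above); the ethanol heat of combustion is the standard HHV of about 1367 kJ/mol:

```python
F = 96485.0          # Faraday constant, C/mol e-

def cell_energy_per_mol_e(cell_volts):
    """Electrical energy input per mole of electrons, in kJ."""
    return F * cell_volts / 1000.0

# Floor on cell voltage: CO2 reduction ~1.2 V from RHE at the cathode,
# plus oxygen evolution at +1.23 V vs. RHE at the anode.
E_CELL = 1.2 + 1.23

HHV_ETOH = 1367.0    # kJ/mol, standard heat of combustion of ethanol
FE_ETOH = 0.63       # faradaic yield to ethanol -- assumed, check the paper

input_kj = 12 * cell_energy_per_mol_e(E_CELL)   # 12 e- per ethanol molecule
eff_at_100 = HHV_ETOH / input_kj
eff_net = FE_ETOH * eff_at_100

print(f"Input per mol e-:     {cell_energy_per_mol_e(E_CELL):.1f} kJ")
print(f"Electricity->ethanol: {eff_at_100:.1%} at 100% faradaic yield")
print(f"At {FE_ETOH:.0%} faradaic yield: {eff_net:.1%}")
```

At 2.43 V the 12-electron ethanol pathway tops out under 50% even before the faradaic yield takes its cut.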
ReactWell appears to be a bio-fuels company previously specializing in biocrude production. This is a related business, as all the products of the ORNL process can be sold or used at a refinery. Oxygen can supply anything that would otherwise be fed by air separation, H2 can go straight to hydroprocessing, the CH4 replaces natural gas for SMR or process heat, and the CO can be added to the input of the water-gas shift reactor in the SMR system to make more hydrogen. Maybe the efficiency is low, but when California has a low-carbon fuel standard and is paying Arizona to take its peak generation from PV, the efficiency is not such a huge factor.
The Engineer's take:
This is nowhere near the world-killing advance I thought it was when I read the first reports in 2016. The energy efficiency is just too low, and it doesn't include any overhead for CO2 capture or separating the ethanol from the aqueous medium.
As a dump load for unreliable electric generation (especially wind and PV), this might be just the ticket. So long as the catalyst is not degraded by voltage swings, this process could replace expensive or difficult-to-site storage such as batteries and pumped hydro. With enough capacity, negative wholesale electric prices would be a thing of the past. Sure can't complain about that.
The Poet's take:
Making booze from cee oh two
Amuses me much.
UPDATE 3/9/2019: Not a peep. Looks like he's not interested in discussing things, his mind is made up.
UPDATE 1/20/2019: Not even a reply yet. I mailed "email@example.com". If anyone knows whether this is the correct e-mail address, or whether it has fallen into disuse, let me know.
Comedian Owen Benjamin has decided that the USA never put men on the moon. (Why? Well... he's a comedian, not an aerospace engineer. There's a lot that's common knowledge in the field that someone so far from it just isn't going to know, and may have great difficulty understanding.) So I have issued him this challenge (in the comments of the video, though said comment does not appear to be visible to the public) and am repeating it here:
I don't have time to watch and dissect 70 minutes of this video plus however long the previous one is (video is NOT a medium for conveying accurate factual content) but I will make you a deal:
You contact me at the address on my blog (ergosphere dot blogspot dot com) and give me any five pieces of evidence that you like which you believe show that the moon landings were faked.
If I can explain that you misinterpreted things or that what you believe is evidence is outright wrong on at least four of them, you make a video about how the conspiracy theorists were misled by their own skepticism. I will help you write it and give you pointers to information.
I will publish the full exchange at The Ergosphere.
So. Challenge issued. I will keep you all up to date on the results.
We used to have a good relationship. I've purchased Dell laptops several times, as well as one well-loved monitor which met an ugly end in a moving accident. My main machine 2 computers ago was a Dell laptop on which I installed some flavor of Linux I've long since forgotten. It installed from an ISO I downloaded and ran like a dream for years until it experienced some age-related failure and refused to boot. A local shop pronounced it unrecoverable, so I moved on and bought another. That one (currently in use) runs Windows 7, which I steadfastly refuse to "upgrade" to anything else by Microsoft and have been too busy to try installing anything else on. I just switched to a much bigger hard drive, but I really want to recover my data from my old Linux Dell and an even older Linux machine. For this, I need Linux.
Need it. Can't do without it. Nothing else will do, full stop.
A while ago I bought a used Inspiron 5559 because Linux compatibility was a feature of that line; it was specifically advertised as an option. I did nothing with it for quite a while because Windows 10 has such a cloying abortion of a user interface and a mass of "telemetry" (spyware) beneath it. But when the time came that I HAD to get my hands dirty messing with computers I bought another hard drive for it too, figuring that now was the time to switch it over and finally get my old stuff back.
Immediately I started running into problems. Ubuntu 12.04 is listed as a compatible operating system for my machine on the Dell web site. Does that mean I can just download an ISO and go? If only! Everywhere I've turned I've been blocked, frustrated and stymied, and this frustration appears to be official Dell policy.
First thing, there are no ISOs on the Dell site. I'd be happy to pull down a few different ones and take my chances until I find one that works for me, but Dell has chosen to completely foreclose that option. Instead, everything must be done through the "OS Recovery Tool". Well, fine. I downloaded it on the Win 10 unit and ran it.
Or rather, tried to run it. I picked "Install" but it didn't appear to do anything. Searching through the cloying abortion of the Win 10 start menu I found something that looked likely, but when I ran it it created a "recovery drive" without asking me for any of the information it would require to do the install that I want. This behavior was repeatable. In frustration and anger I gave up for the evening.
Today I reformatted the flash drive, took it over to my Win 7 machine, and downloaded the recovery installer yet again. The first time it ran, it took quite a few minutes before ultimately reporting a failure in some kind of unzip process. (It won't even retry a failed operation?) I decided to try again, and after an equally long delay it reported success... but it never gave me a Linux option on the choices of OS to install, just Win10 and "SupportAssist OS Recovery". Well, maybe the recovery tool would let me install Ubuntu. I took the flash drive back to the Win 10 machine, plugged it in, hit power, pressed F12...
and I got a boot menu on which "SupportAssist" was one of the options!
At this point I remembered that I hadn't swapped out the Win10 hard drive for a clean one, and I wanted to save that drive Just In Case, so I powered down and spent some busy minutes with a screwdriver. New drive installed and machine buttoned up, I hit power and keyed F12 again.
SupportAssist was NOT on the list of options this time! Neither was USB boot. WTF? Well, maybe Secure Boot was the problem (but why not last time?). I disabled it and fired up again, which allowed me a USB boot option. That died with "Selected Boot Device Failed", behaving exactly the same on several attempts. The USB drive that had just worked a few minutes before was not working any more. Why? What did the software do to itself to make it unbootable? Stymied, frustrated and angry all over again, I went back to my Win 7 machine and used the tool to build the flash drive for the third time.
And that is where I am right now. The thrice-built USB boot stick is still giving me "Selected Boot Device Failed"; this appears to be a hard, unrecoverable error. I have tried using Disk Manager to wipe the stick and start over from scratch, but your tool appears to have locked it so that I can't remove the partition and try again, at least not under Win 7. (I can't re-use the stick for much else, either; there's a 2 GB partition and the rest "unallocated". That's malicious destruction of property.)
So here I am, Dell. What was a simple, fast process in late 2011 is an exercise in frustration and wasted time on the last day of 2018. Instead of simply giving me standard stuff like ISOs and drivers and letting me be responsible for the results, you deliberately stand in the way of me doing with MY computer as I want and need to. Thanks to you I have about $400 sunk into hardware that is useless in its current condition, including a brand-new USB flash drive that you have effectively stolen from me in any sense of getting full use out of it.
Dell, have you defrauded me? You told me that the Inspiron 5559 could run Linux, and I bought it on that representation. I've tried every way I can figure out to use YOUR tools to install YOUR approved version of Ubuntu on this machine, and I've come up empty. Was this a fuckup, or did you deliberately lie to me? What will you do to make it good?
Unless this situation turns around REAL fast, I am done with Dell computers. It's not me, it's you. Your control-freak behavior is somewhere between destructive and downright evil, and you need to get over it.
Power to Gas (P2G) is best for (solar, wind etc.) farm-scale energy storage for most farms where there is no possibility of farm-scale pumped hydro.
P2G is excellent for mopping up all the surplus farm power because any energy which P2G can store is an efficiency gain compared to the 100% loss of all curtailed generation.
Grid managers should cease paying curtailment payments and spend the same money more wisely offering incentives to farm operators to install farm-scale energy storage.
In the comment below I had the temerity to ask
Simply not generating surpluses very much or often gets rid of most spilled power too, and also the capital and operating cost of generating it. What's the goal here?
But mostly I wanted to go into the energetics in greater detail than I did there.
Present-day electrolyzers take around 43 kWh (154.8 MJ) of electricity to produce 1 kg of hydrogen. That 1 kg of hydrogen has a higher heating value of 141.88 MJ and a lower heating value of 119.96 MJ. Suppose this hydrogen is burned in a non-condensing context, such as a gas-turbine power plant or a kitchen stove. Almost 1/4 of the input energy is lost between the electrolyzer inefficiencies and the latent heat of the lost water vapor. Even if burned in a 60% efficient (LHV) CCGT, the net round-trip efficiency drops below 50% before losses in pumping and storage are included.
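In code, the round-trip arithmetic looks like this (a sketch using the figures above; real-world compression, storage and pumping losses would pull it lower still):

```python
# Round trip: electricity -> hydrogen -> electricity via a CCGT
ELEC_IN_MJ = 43.0 * 3.6      # 154.8 MJ of electricity per kg H2
LHV_H2_MJ = 119.96           # MJ/kg, lower heating value
CCGT_EFF = 0.60              # best-of-class combined cycle, LHV basis

electrolyzer_eff = LHV_H2_MJ / ELEC_IN_MJ    # electricity-to-fuel, LHV
round_trip = electrolyzer_eff * CCGT_EFF     # before storage/pumping losses

print(f"Electrolyzer (LHV basis): {electrolyzer_eff:.1%}")
print(f"Round trip via CCGT:      {round_trip:.1%}")
```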
What IS the goal of this exercise? Suppose for a moment that it is to displace CO2 emissions. How effective is P2G for this purpose? Well, not very. Replacing 1 kWh generated with OCGTs at 500 gCO2/kWh with 1 kWh generated with best-of-class CCGTs at 320 gCO2/kWh eliminates 180 grams of emissions. Replacing 1 kWh generated with natural gas with 1 kWh generated by P2G hydrogen eliminates... (working the units)
1 kWh / 43 kWh/kg * 119.96 MJ/kg / 50 MJ/kg(CH4) * 2750 gCO2/kg(CH4) = 153.4 gCO2 eliminated per kWh put into P2G. This will be roughly the same for any natural gas power plant, as it displaces fuel on a per-MJ LHV basis.
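The same unit-chasing as a script, using the constants above:

```python
# How much CO2 does 1 kWh of electricity put into P2G actually displace?
KWH_PER_KG_H2 = 43.0      # electrolyzer input per kg hydrogen
LHV_H2 = 119.96           # MJ/kg
LHV_CH4 = 50.0            # MJ/kg natural gas (as methane)
CO2_PER_KG_CH4 = 2750.0   # g CO2 per kg CH4 burned (44/16 ratio)

kg_h2 = 1.0 / KWH_PER_KG_H2          # kg H2 per kWh in
mj_h2 = kg_h2 * LHV_H2               # fuel energy produced, LHV
kg_ch4_displaced = mj_h2 / LHV_CH4   # gas displaced on a per-MJ basis
g_co2 = kg_ch4_displaced * CO2_PER_KG_CH4

print(f"{g_co2:.1f} gCO2 displaced per kWh into P2G")   # ~153.4
print(f"{500 - 320} gCO2 displaced per kWh shifted from OCGT to CCGT")
```

Shifting a kWh of generation from an OCGT to a CCGT beats a kWh fed into P2G, which is the whole point of the comparison.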
But that's not the end of it. What's not usually talked about is the effect of "renewables" on the rest of the generating mix. Due to the high ramp rates of wind and solar, the rest of the generation has to be highly flexible to compensate. More efficient combined-cycle plants can't ramp quickly due to thermal constraints on the steam side, and they can often only turn their output down by 30% or so. Given this (absent hydro), less-efficient open-cycle gas turbines are usually the only viable option. This cuts the maximum thermal efficiency from as high as 62% down to around 40%.
This is a bait-and-switch of enormous size. To get "renewable energy", you have to increase per-kWh emissions from the NG balancing generators on the order of 55% over what is achievable with CCGTs. Renewables would require a capacity factor around 35-36% just to break even on emissions; less than that and emissions are WORSE!
America has definitely fallen for the bait-and-switch. The job now is threefold:
1. Get to a metric of emissions, period. Where energy comes from is irrelevant; eliminate all portfolio standards and mandates, FITs, net metering, etc.
2. Aim at fuel/carbon efficiency rather than RE generation. RE which forces lower efficiency in the balancing generators can be worse than useless.
3. Use appropriate market design and system architecture to get efficiency plus resiliency.
There are some options out there which can easily beat 153 gCO2 savings per kWh input. The problem (if you can call it that) is that they are way, way outside the box of conventional thinking on energy matters.
"Our case is based on science, while the opposition is based on political philosophy. When a nation whose welfare is highly dependent on technology makes vital technological decisions on the basis of political philosophy rather than on the basis of science, it is in mortal danger."
Using Pierre's numbers, 1 gallon of diesel equals 10kWh, so the overnight charge would be 7kWh equals about three quarts.
The EIA says a gallon of diesel is 137,452 BTU, or just over 40 kWh(th). Converted to work in your typical light-duty engine you might get 16 kWh out of it. Your usual "convenience cord" is capable of 1440 W (120 VAC @ 12 A), so a 7-hour charge can yield as much as 10 kWh from a standard wall outlet. PHEV batteries have widely varying capacities: the Prius+ has just 4.4 kWh, the Ford Energi models started out at 7.6 kWh and are going up to 9 kWh next year, and the Pacifica plug-in has 16 kWh. These figures correspond to just over a quart, just under half a gallon, and a gallon of diesel, respectively.
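For anyone who wants to check the conversions, a quick sketch; the 40% engine efficiency is my round-number assumption for a good light-duty diesel:

```python
BTU_PER_GAL_DIESEL = 137452      # EIA figure
KWH_PER_BTU = 1.0 / 3412.14
ENGINE_EFF = 0.40                # assumed light-duty diesel efficiency

kwh_th = BTU_PER_GAL_DIESEL * KWH_PER_BTU   # thermal energy per gallon
kwh_work = kwh_th * ENGINE_EFF              # useful work per gallon

print(f"1 gal diesel = {kwh_th:.1f} kWh(th), ~{kwh_work:.0f} kWh of work")
for name, batt_kwh in [("Prius+", 4.4), ("Ford Energi", 7.6),
                       ("Pacifica PHEV", 16.0)]:
    print(f"{name}: {batt_kwh / kwh_work:.2f} gallon-equivalent")
```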
I used to drive a Passat TDI. I drove the automatic like a stick and averaged 38 MPG city or highway. Half a gallon of fuel would take me about 20 miles. I drive a Fusion Energi now and that's about how far the battery power will take me (depending on speed, terrain and weather of course), so that seems like a pretty fair equivalence.
you spend 18 hours charging to get energy equivalent to roughly 1.2 gallons of diesel per day.
If you had a Chrysler Pacifica charging off a standard wall outlet for 18 hours a day, you'd get up to about 1.6 gallons-equivalent. Vehicles with smaller batteries would reach full charge and have to stop; the Fusion reaches full in about 5 hours from your garden-variety wall outlet and about 90 minutes on a Level 2 charger.
1.6 gallons a day 250 days a year is 400 gallons-equivalent. The EPA-rated fuel consumption for the Pacifica hybrid is 32 MPG, so for 15,000 miles/year the expected fuel consumption is about 470 gallons. Replacing 400 of those gallons with electric power slashes the net fuel requirement by 85%. My experience is consistent. The standard drivetrain in my car is rated at 26 MPG, and I'm averaging just over 130 MPG per the dash display.
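The annual arithmetic, as a quick check:

```python
GAL_EQUIV_PER_DAY = 1.6     # Pacifica on 18 h/day of wall-outlet charging
DAYS = 250                  # working days per year
MILES_PER_YEAR = 15000
MPG_HYBRID = 32             # EPA rating for the Pacifica hybrid

electric_gal = GAL_EQUIV_PER_DAY * DAYS       # ~400 gallons-equivalent
total_gal = MILES_PER_YEAR / MPG_HYBRID       # ~470 gallons if all fuel

print(f"Electric: {electric_gal:.0f} gal-equiv of {total_gal:.0f} gal")
print(f"Fuel displaced: {electric_gal / total_gal:.0%}")
```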
To compete with IC, you need to be able to drive hundreds of miles, with a heater blasting hot air, then fuel up in a few minutes and do it again. To get a 300 mile range, you need ten times that amount of energy, or more.
You don't need to compete with IC to replace most of your fuel. Most trips are short trips, and engines are very inefficient when cold. If you electrify most or all of the short trips and eliminate most of the cold starts, you've eliminated most of the fuel consumption with it. If you delay the engine starts until the vehicle has left the city, you get rid of the pollution generated in the city. The engine also warms up faster if run under load, improving the efficiency.
It was announced some time ago that The Energy Collective was being taken over by the Power Industry Network (energycentral.com).
Perhaps associated with this, the site had some major slowdowns and technical problems for a week or two. Then all of a sudden it just went dark, with a message that maintenance was going on. Now all blog entry links redirect to a page about re-hosting. Those discussions appear to be toast; nobody will revisit them if they ever re-appear. This follows the last transfer from the Drupal blog software, in which ham-fisted conversion destroyed most of the formatting (and thus legibility) of existing posts and comments. Heaven only knows what will be left after this new crew gets done mangling it.
The new owners don't care about human factors like... readability. Comment text is colored #8D8D8D (very light gray) on a white background. How are you supposed to read that? Do these clowns not know anyone who reads?
So far I've seen two new entries and one other comment from other TEC regulars. We'll see how many of them bother to come back. I'm betting it won't be many, as they've already found other things to do with their time.
Via a correspondent who asked to remain anonymous and unquoted, I received some screenshots of panels from a brand-new comic called Alt*Hero. Story authorship is "Vox Day", the pen-name of one Theodore Beale. He claims to have a 150-ish IQ and refers to Aristotle regularly.
I'm passingly familiar with this guy; he comes up with clever expressions. He used to get held up as an example of scientific pig-ignorance on scienceblogs, back when I read that site. So what does he write into his comic?
A 90-kilogram object with an acceleration of 3,825 kilometers per hour strikes with the force of 10 tons
In short, physics bullshit. It's bullshit from the units (acceleration is in units of distance/time²) through to the figures. Per the story, the "Redshift" character can hit supersonic speeds from a standing start in just a few meters. Figuring constant acceleration from 0 to 1000 m/s in 10 meters, the force required isn't "10 tons", it's 4.5 meganewtons (the weight of about 460 metric tons). That's almost 2 orders of magnitude greater. Impact into an immobile object would be orders of magnitude more.
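The whole calculation is first-semester kinematics, with the same assumptions (0 to 1000 m/s over 10 meters):

```python
M = 90.0       # kg, mass of the character
V = 1000.0     # m/s, roughly Mach 3
D = 10.0       # m, distance over which to reach speed
G = 9.81       # m/s^2, standard gravity

a = V**2 / (2 * D)          # v^2 = 2*a*d, constant acceleration
force_n = M * a             # Newton's second law
tonnes = force_n / (G * 1000)   # equivalent weight in metric tons

print(f"Acceleration: {a:.0f} m/s^2 ({a / G:.0f} g)")
print(f"Force: {force_n / 1e6:.1f} MN, the weight of ~{tonnes:.0f} tonnes")
```

That's 4.5 MN, the weight of roughly 460 metric tons: nearly two orders of magnitude beyond the comic's "10 tons".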
Suppose you launched 90 kg to 1000 m/sec from a building. A 460-ton shock would probably cave floors in and might even knock the building down. That's physics. More to the point, a skydiver in free-fall in the arrowhead configuration reaches terminal velocity at about 200 MPH. There's no way a runner, however strong, could reach > 1 km/sec speeds by pushing against the ground (and at Mach 3 he'd broil himself from the air heated by his own shock front). Another character flies, without using wings or any other aerodynamic method. It is obvious that Physics As We Know It is not operating here (it's a comic book, Pure Fucking Magic is not just allowed but expected).
This raises the question: why the pseudo-physics bullshit? Does he not know any better?
Vox Day has long harped about scientific fraud and error, and per my correspondent even made up several new words purporting to help describe science accurately. Ironically, this supposedly 150-IQ guy can't even hack first-semester physics. (Does this extend to things like chemistry and biology? Almost certainly the latter.) He has some massive gaps in his abilities and understanding—worse, in subjects that people of much lesser IQ scores have mastered without undue difficulty. That has to sting. It looks like his enmity towards science and its practitioners comes down, not to principled differences, but a large dose of envy. Face-palm time.
(As for the comic itself, it looks fun. I may buy it, and that will make it one out of perhaps 5 I ever spent money on. But seriously, if diversity box-checking in fiction is a turn-off, so's cringeworthy stuff like this. Best just not to go there.)
Once more, good ol' Yahoo Mail has decided to screw things up totally. Their new mail form does not work on desktop. At ALL. Here's a brief list of problems:
Mail is not marked as read when read. It remains unread.
The reply links (both of them) are non-functional. I cannot reply to mail.
The top-frame buttons to delete, mark read, etc. are non-functional.
The buttons to go to previous/next conversation don't work.
Keyboard shortcuts to e.g. mark mail as read don't work either.
And to add insult to injury,
The button to bring up the options menu, where there at least used to be an option to switch back to the old Yahoo mail, is also 100% non-functional.
The "flat" UI style is considerably harder to use than the previous version, and heads should have rolled over that abortion alone. But this? If the vice president in charge isn't fired, the company is just not serious about running the service.
Oh, and Yahoo? If you're reading this, just switch everyone back to the previous version. You obviously have no idea when something is ready for prime time.
Edit: Yahoo made the new version fail only with Pale Moon. Apparently they tested with other browsers. I was able to get in and switch back to the old version using a different browser.