Ethics, Æsthetics, Ecology, Education

Story of the Hour

The proportion of harmful substances in particulate matter is much higher than assumed
Apr 1, 10:00 AM
The Oregon Institute for Creative Research

Study co-author Alexandre Barth setting up the device that measures harmful components in particulate matter in real time. (Photo: University of Basel, Department of Environmental Sciences)

People breathing contaminated air over the course of years are at greater risk of developing numerous diseases. This is thought to be due to highly reactive components in particulate matter, which affect biological processes in the body. However, researchers from the University of Basel have now shown that precisely these components disappear within hours, and that previous measurements have therefore substantially underestimated the quantities in which they are present.

From chronic respiratory problems to cardiovascular diseases, diabetes and dementia, health damage caused by particulate matter air pollution is wide-ranging and serious. The World Health Organization (WHO) estimates that over six million deaths a year are caused by increased exposure to particulate matter. The chemical composition of these tiny particles in the air, which come from a wide range of both anthropogenic and natural sources, is highly complex. Which particles trigger which reactions and long-term diseases in the body is the subject of intensive research.

This research focuses on particularly reactive components known to experts as oxygen radicals or reactive oxygen species. These compounds can oxidize biomolecules inside and on the surface of cells in the respiratory tract, damaging them and in turn triggering inflammatory responses that impact the entire body.

Experts previously collected the particulate matter on filters and analyzed the particles following a delay of days or weeks. "Since these oxygen-containing radicals react with other molecules so quickly, they should be measured without delay," says atmospheric scientist Professor Markus Kalberer, explaining the idea behind the study that he and his team recently published in Science Advances.

Measured from the air in real time

The team from the Department of Environmental Sciences has developed a new method for measuring particulate matter within seconds. This involves collecting the particles directly from the air in a liquid, where they come into contact with various chemicals. Within this solution, the oxygen radicals then react and produce quantifiable fluorescence signals.
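The article does not give the team's actual chemistry, but the quantification step it describes, a fluorescence signal proportional to radical concentration, can be illustrated with a generic calibration-curve sketch. All numbers below are invented for illustration and are not the Basel team's data:

```python
import numpy as np

# Hypothetical calibration standards: known radical concentrations (nmol/L)
# and the fluorescence intensity (arbitrary units) each one produces.
standard_conc = np.array([0.0, 25.0, 50.0, 100.0, 200.0])
standard_fluor = np.array([12.0, 61.0, 110.0, 208.0, 404.0])

# Fit a linear calibration curve: fluorescence = slope * conc + intercept.
slope, intercept = np.polyfit(standard_conc, standard_fluor, 1)

def conc_from_fluorescence(signal: float) -> float:
    """Invert the calibration curve to estimate radical concentration."""
    return (signal - intercept) / slope

# A sample measured seconds after collection vs. the same sample after a
# delay, illustrating how short-lived radicals vanish from the signal.
fresh = conc_from_fluorescence(300.0)
delayed = conc_from_fluorescence(75.0)
loss_fraction = 1 - delayed / fresh  # fraction of radicals no longer detected
```

The point of the sketch is only that immediate measurement and delayed measurement of the same air sample can imply very different radical concentrations, which is the discrepancy the new instrument was built to expose.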

Measurements taken with the new method reveal that 60% to 99% of oxygen radicals disappear within minutes or hours. Previous analyses of particulate matter based on filter deposition therefore delivered a distorted image. "However, since the measurement error in the case of delayed analysis isn't constant, it isn't possible to simply extrapolate from previous filter-based analyses," says Kalberer. The real proportion of harmful substances in the particulate matter is, he says, significantly higher than previously assumed.

According to the atmospheric researcher, the principal challenge with the new method was to develop a measuring instrument that carried out chemical analyses autonomously and continuously under stable conditions not only in the laboratory but also during field measurements at a wide range of locations.

Different and stronger inflammatory responses

Moreover, further laboratory analyses with epithelial cells from the lungs provided evidence that, in particular, the short-lived, highly reactive components of particulate matter have a different effect than that of the particles analyzed using the previous, delayed measurements. The short-lived reactive components in particles triggered different and stronger inflammatory responses.

In a subsequent step, the measuring instrument will be further developed in order to obtain deeper insights into the composition and effects of particulate matter. Kalberer explains: "If we can measure the proportion of highly reactive, harmful components more accurately and reliably, it will also be possible to adopt better protective measures."

Original publication
Steven J. Campbell et al.
Short-lived reactive components substantially contribute to particulate matter oxidative potential
Science Advances (2025), doi: 10.1126/sciadv.adp8100

Scientists Reveal “A Fundamental Process in Nature” – The Environmental Rules That Plants Cannot Break
Apr 2, 10:00 AM
The Oregon Institute for Creative Research

The interplay of environmental conditions and geographical barriers such as mountains and lakes determines where plants thrive – an international study shows how these patterns have developed over millions of years. Credit: Holger Kreft

By University of Göttingen, March 31, 2025

Global research team explores how environmental factors and dispersal barriers influence biodiversity.

Why do certain plants flourish in some regions but not in others? A study led by researchers at the University of Göttingen sheds light on the factors that determine where plants grow and how these patterns have evolved over millions of years.

The team analyzed data from nearly 270,000 seed plant species across the globe. Their findings, published in Nature Ecology & Evolution, reveal that both environmental conditions and natural barriers to movement, such as mountains, oceans, and climate zones, play key roles in shaping global plant diversity.

To uncover these patterns, the researchers used advanced techniques that combine current plant distribution data with information about evolutionary relationships between species. They also incorporated modern environmental data and reconstructed Earth’s past climate and geography to understand how these factors have influenced plant distributions through deep time.

The team examined how variations in climate, soil, and other environmental factors determine where plants can thrive and how physical barriers – such as oceans, mountain ranges, and areas with inhospitable climates – restrict plant dispersal.

Environment and Barriers

The findings show that environmental conditions, particularly climate, are important factors in shaping plant distributions, with their influence remaining consistent across evolutionary timescales.

Physical barriers like oceans and mountains played a significant role in limiting the spread of more recently evolved plant groups but had a much smaller effect on ancient plant groups, which have had longer periods to disperse widely. Past tectonic plate positions and movements, reconstructed from geological data, were found to have only a modest impact on plant diversity, with their strongest effects occurring between 20 and 50 million years ago.

“These findings reveal a fundamental process in nature,” says Dr Lirong Cai from the University of Göttingen and the German Centre for Integrative Biodiversity Research (iDiv). “Given enough time, plants can overcome the barriers of vast distances and geography, but they often remain limited by the environments they encounter.”

Reference: “Environmental filtering, not dispersal history, explains global patterns of phylogenetic turnover in seed plants at deep evolutionary timescales” by Lirong Cai, Holger Kreft, Pierre Denelle, Amanda Taylor, Dylan Craven, Wayne Dawson, Franz Essl, Mark van Kleunen, Jan Pergl, Petr Pyšek, Marten Winter, Francisco J. Cabezas, Viktoria Wagner, Pieter B. Pelser, Jan J. Wieringa and Patrick Weigelt, 29 November 2024, Nature Ecology & Evolution.
DOI: 10.1038/s41559-024-02599-y

Leaf Vein Architecture Allows Predictions of Past Climate
Apr 3, 10:00 AM
The Oregon Institute for Creative Research

The highly organized minor vein network in a leaf of a tropical forest tree, Ampelocera ruizii. UCLA research shows how the scaling of vein systems across flowering plants arises from a general developmental algorithm and explains global ecological patterns. Credit: Michael Rawls, UCLA Life Sciences

By Stuart Wolpert, University of California, Los Angeles, May 24, 2012
A newly published report describes the mathematical linkages between leaf vein systems and leaf size from around the globe, improving scientists’ ability to predict and interpret climate of the deep past from leaf fossils.

University of California, Los Angeles (UCLA) life scientists have discovered new laws that determine the construction of leaf vein systems as leaves grow and evolve. These easy-to-apply mathematical rules can now be used to better predict the climates of the past using the fossil record.

The research, published May 15 in the journal Nature Communications, has a range of fundamental implications for global ecology and allows researchers to estimate original leaf sizes from just a fragment of a leaf. This will improve scientists’ prediction and interpretation of climate in the deep past from leaf fossils.

Leaf veins are of tremendous importance in a plant’s life, providing the nutrients and water that leaves need to conduct photosynthesis and supporting them in capturing sunlight. Leaf size is also of great importance for plants’ adaptation to their environment, with smaller leaves being found in drier, sunnier places.

However, little has been known about what determines the architecture of leaf veins. Mathematical linkages between leaf vein systems and leaf size have the potential to explain important natural patterns. The new UCLA research focused on these linkages for plant species distributed around the globe.

“We found extremely strong, developmentally based scaling of leaf size and venation that has remained unnoticed until now,” said Lawren Sack, a UCLA professor of ecology and evolutionary biology and lead author of the research.

How does the structure of leaf vein systems depend on leaf size? Sack and members of his laboratory observed striking patterns in several studies of just a few species. Leaf vein systems are made up of major veins (the first three branching "orders," which are large and visible to the naked eye) and minor veins (the mesh embedded within the leaf, which makes up most of the vein length).

Federally funded by the National Science Foundation, the team of Sack, UCLA graduate student Christine Scoffoni, three UCLA undergraduate researchers, and colleagues at other U.S. institutions measured hundreds of plant species worldwide using computer tools to focus on high-resolution images of leaves that were chemically treated and stained to allow sharp visualization of the veins.

The team discovered predictable relationships that hold across different leaves throughout the globe. Larger leaves had their major veins spaced further apart according to a clear mathematical equation, regardless of other variations in their structure (like cell size and surface hairiness) or physiological activities (like photosynthesis and respiration), Sack said.

Larger leaves have their major veins spaced farther apart. The squares show second and third-order leaf veins, matched with leaf silhouettes for given species of a Panamanian rainforest, all drawn to the same scale. UCLA research shows how the scaling of vein systems across flowering plants arises from a general developmental algorithm and explains global ecological patterns. Credit: Lawren Sack, UCLA Life Sciences

“This scaling of leaf size and major veins has strong implications and can potentially explain many observed patterns, such as why leaves tend to be smaller in drier habitats, why flowering plants have evolved to dominate the world today, and how to best predict climates of the past,” he said.

These leaf vein relationships can explain, at a global scale, the most famous biogeographical trend in plant form: the predominance of small leaves in drier and more exposed habitats. This global pattern was noted as far back as the ancient Greeks (by Theophrastus of Lesbos) and by explorers and scientists ever since. The classical explanation for why small leaves are more common in dry areas was that smaller leaves are coated by a thinner layer of still air and can therefore cool faster and prevent overheating. This would certainly be an advantage when leaves are in hot, dry environments, but it doesn't explain why smaller leaves are found in cool, dry places too, Sack noted.

Last year, Scoffoni and Sack proposed that small leaves tend to have their major veins packed closely together, providing drought tolerance. That research, published in the journal Plant Physiology, pointed to an advantage for improving water transport during drought. To survive, leaves must open the stomatal pores on their surfaces to capture carbon dioxide, but this causes water to evaporate out of the leaves. The water must be replaced through the leaf veins, which pull up water through the stem and root from the soil. This drives a tension in the leaf vein “xylem pipes,” and if the soil becomes too dry, air can be sucked into the pipes, causing blockage.

The team had found, using computer simulations and detailed experiments on a range of plant species, that because smaller leaves have major veins that are packed closer together — a higher major vein length per leaf area — they had more “superhighways” for water transport. The greater number of major veins in smaller leaves provides drought tolerance by routing water around blockages during drought.

This explanation is strongly supported by the team’s new discovery of a striking global trend: higher major vein length per leaf area in smaller leaves.

The Nature Communications research provides a new ability to estimate leaf size from a leaf fragment and to better estimate past climate from fossil deposits that are rich in leaf fragments. Because of the very strong tendency for smaller leaves to have higher major vein length per leaf area, one can use a simple equation to estimate leaf size from fragments.
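The article does not reproduce the equation itself, so the following is a purely hypothetical sketch with invented numbers and an assumed power-law form: fit major vein length per leaf area (VLA) against leaf area on log-log axes across a survey of leaves, then invert the fit to estimate intact leaf size from the vein density measurable on a fragment:

```python
import numpy as np

# Hypothetical survey data: intact leaf area (cm^2) and major vein length
# per leaf area (mm/mm^2). Smaller leaves have denser major veins.
leaf_area = np.array([2.0, 8.0, 30.0, 120.0, 500.0])
major_vla = np.array([4.0, 2.0, 1.03, 0.52, 0.255])

# Fit a power law VLA = a * area^b on log-log axes; b should come out
# negative, reflecting the inverse scaling described in the article.
b, log_a = np.polyfit(np.log(leaf_area), np.log(major_vla), 1)

def estimate_leaf_area(fragment_vla: float) -> float:
    """Invert the fitted scaling to estimate intact leaf area (cm^2)
    from the major vein density measured on a fragment."""
    return float(np.exp((np.log(fragment_vla) - log_a) / b))
```

Under this sketch, a paleobotanist measuring major vein density on a leaf fragment could read off an approximate intact leaf size, which is the kind of inference the article says the real published equation enables; the actual coefficients would come from the Nature Communications dataset, not these invented values.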

Major vein length per leaf area can be measured by anyone willing to look closely at the large and small leaves around them.

“We encourage anyone to grab a big and a small leaf from trees on the street and see for yourself that the major veins are larger and spaced further apart in the larger leaf,” Scoffoni said.

Because leaf size is used by paleobiologists to “hindcast” the rainfall experienced when those fossil plants were alive and to determine the type of ecosystem in which they existed, the ability to estimate intact leaf size from fragmentary remains will be very useful for estimates of climate and biodiversity in the fossil record, Sack said.

The research also points to a new explanation for why leaf vein evolution allowed flowering plants to take over tens of millions of years ago from earlier evolved groups, such as cycads, conifers and ferns. With few exceptions, only flowering plants have densely packed minor veins, which provide the water to keep leaf cells hydrated and the nutrients to fuel photosynthesis; this allows flowering plants to achieve much higher photosynthetic rates than earlier evolved groups, Sack said.

The UCLA team's new research also showed that the major and minor vein systems in the leaf evolve independently and that the relationship between these systems differs depending on leaf size.

“While the major veins show close relationships with leaf size, becoming more spaced apart and larger in diameter in larger leaves, the minor veins are independent of leaf size and their numbers can be high in small leaves or large leaves,” Sack said. “This uniquely gives flowering plants the ability to make large or small leaves with a wide range of photosynthetic rates. The ability of the flowering plants to achieve high minor-vein length per area across a wide range of leaf sizes allows them to adapt to a much wider range of habitats — from shade to sun, from wet to dry, from warm to cold — than any other plant group, helping them to become the dominant plants today.”

The strength of the mathematical linkage of leaf veins with leaf size across diverse species raises the question of cause.

The UCLA team explains that these patterns arise from a shared script or "program" for leaf expansion and the formation of leaf veins. The team reviewed the past 50 years of studies of isolated plant species and found striking commonalities across species in their leaf development.

“Leaves develop in two stages,” Sack said. “First, the tiny budding leaf expands slightly and slowly, and then it starts a distinct, rapid growth stage and expands to its final size.”

The major veins form during the first, slow phase of leaf growth, and their numbers are complete before the rapid expansion phase, he said. During the rapid expansion phase, those major veins are pushed apart, and can simply extend and thicken to match the leaf expansion. Minor veins can continue to be initiated in between the major veins during the rapid phase, as the growing leaf can continue to lay down new branching strands of minor veins.

In the final, mature leaf, it is possible for minor veins to be spaced closely, even in a large leaf where the major veins would be spaced apart.

"The generality of the development program is striking," Sack said. "It's consistent with the fact that different plant species share important vein development genes — and the global scaling patterns of leaf vein structure with leaf size emerge in consequence."

These vein trends, confirmed with high-resolution measurements, are “obvious everywhere under our noses,” Sack and Scoffoni said.

Why had these trends escaped notice until now?

“This is the time for plants,” Sack said. “It’s amazing what is waiting to be discovered in plant biology. It seems limitless right now. The previous century is known for exciting discoveries in physics and molecular biology, but this century belongs to plant biology. Especially given the centrality of plants for food and biosphere sustainability, more attention is being focused, and the more people look, the more fundamental discoveries will be made.”

Reference: “Developmentally based scaling of leaf venation architecture explains global ecological patterns” by Lawren Sack, Christine Scoffoni, Athena D. McKown, Kristen Frole, Michael Rawls, J. Christopher Havran, Huy Tran and Thusuong Tran, 15 May 2012, Nature Communications.
DOI: 10.1038/ncomms1835


Innovation at a price: The environmental cost of the Cal State’s new AI initiative
Apr 18, 8:00 AM
The Oregon Institute for Creative Research

Aviv Kesar | Mustang News

by Kaylin O'Connell / April 18, 2025

Artificial intelligence is breaking new ground for Cal Poly under the Cal State AI initiative, which announced its collaboration with OpenAI to make AI tools free for Cal State students. However, the initiative raises environmental concerns due to energy use, water consumption and a lack of transparency from big tech companies.

The Cal State AI initiative adds to environmental harm by bringing an influx of new customers to large language models (LLMs), AI systems like ChatGPT that consume high amounts of energy and water, according to Foaad Khosmood, a computer science & software engineering professor.

“If you have half a million new customers, that’s more incentive for those companies and other companies to create new LLMs,” Khosmood said. “Now they think they can sell it to all these universities and make even more money.”

The Cal State initiative encourages students to use AI without knowledge of its environmental effects, according to Amara Zabback, a computer science & software engineering graduate student. Zabback gave a presentation titled "Environmental Impact of LLMs and Gen AI" in a graduate class on AI.

“As soon as universities or university systems are encouraging [AI use], it’s just putting way more support behind it,” Zabback said. “They need to try a lot harder to hold these people accountable and hold companies accountable.”

Cal Poly remains committed to its environmental goals, according to Cal Poly Spokesperson Matt Lazier.

“As this is a new initiative, Cal Poly and the CSU will be looking closely at AI’s positive and negative impacts,” Lazier said in an email to Mustang News. “The CSU AI Strategy seeks to be a leader in the ethical, social and responsible use of AI in education, and our commitment to sustainability will be a part of those discussions.”

The bulk of AI's environmental impact stems from LLM training, as opposed to individuals' ChatGPT searches, as the training creates higher carbon emissions and requires enormous amounts of water and power, according to Khosmood.

LLMs run on densely packed hardware that draws significant power, which leads to excessive heating and requires water and refrigeration systems to keep the hardware cool, Khosmood said.

Data centers can use up to five million gallons of water daily, intensifying water shortages in hot, dry climates like Arizona and Texas, Zabback said. These areas are typically water scarce, so the data centers negatively impact nearby communities and ecosystems, she explained.

Researchers in 2019 found that training an LLM could produce as much carbon dioxide as five cars over their lifetimes.

“It’s a really cool and exciting technology, but I think that it needs to be used responsibly,” Zabback said. “I think people who use it need to have a profound understanding of what is actually happening.”

Major companies like Microsoft, a leading stakeholder in OpenAI, have pledged to become carbon negative by 2030, Zabback said. However, Khosmood is skeptical of the company's claim, explaining that corporate carbon-offset and water-recycling efforts typically see little follow-through or confirmed impact.

"It's really hard to pinpoint and exactly quantify [LLMs'] carbon emissions," Zabback said. "It's not like these data centers are out here advertising the resource use."

Khosmood believes that increased transparency, both from big tech companies and the Cal State system, is necessary to increase public understanding about the environmental impacts of LLMs.

“When you go to the supermarket and you buy something, the ingredients are listed on there, right?” Khosmood said, holding up a bottle of Coke Zero. “This is all by law. We have to do this, right? Well, they should have to do the same thing for digital products.”

Specifically, Khosmood believes that LLM sites should be open about their training locations, carbon emissions and fine-tuning methods. He emphasized that this transparency should exist at every level, from the federal government to the Cal State system.

“[LLMs] should publish all the ingredients that went into this and what impact it’s having,” Khosmood said. “It’s not too much to ask, I think. This could be making us sick. This could be making the planet sick.”

Zabback believes that innovation does not need to come at an environmental cost; the balance is more nuanced.

"I think that we can have a lot of really good innovations that can change people's lives and change things for the better," Zabback said. "But I don't think that we need to steamroll the environment in the meantime."


Trump’s War on Measurement Means Losing Data on Drug Use, Maternal Mortality, Climate Change and More
Apr 18, 6:00 AM
The Oregon Institute for Creative Research

Still from video illustration for ProPublica

by Alec MacGillis / April 18, 2025, 6 a.m. EDT

By slashing teams that gather critical data, the administration has left the federal government with no way of understanding if policies are working — and created a black hole of information whose consequences could ripple out for decades.

More children ages 1 to 4 die of drowning than of any other cause. Nearly a quarter of adults received mental health treatment in 2023, an increase of 3.4 million from the prior year. The number of migrants from Mexico and northern Central American countries stopped by the U.S. Border Patrol was surpassed in 2022 by the number of migrants from other nations.

We know these things because the federal government collects, organizes and shares the data behind them. Every year, year after year, workers in agencies that many of us have never heard of have been amassing the statistics that undergird decision-making at all levels of government and inform the judgments of business leaders, school administrators and medical providers nationwide.

The survival of that data is now in doubt, as a result of the Department of Government Efficiency’s comprehensive assault on the federal bureaucracy.

Reaction to those cuts has focused understandably on the hundreds of thousands of civil servants who have lost their jobs or are on the verge of doing so and the harm that millions of people could suffer as a result of the shuttering of aid programs. Overlooked amid the turmoil is the fact that many of DOGE’s cuts have been targeted at a very specific aspect of the federal government: its collection and sharing of data. In agency after agency, the government is losing its capacity to measure how American society is functioning, making it much harder for elected officials or others to gauge the nature and scale of the problems we are facing and the effectiveness of solutions being deployed against them.

The data collection efforts that have been shut down or are at risk of being curtailed are staggering in their breadth. In some cases, datasets from past years now sit orphaned, their caretakers banished and their future uncertain; in others, past data has vanished for the time being, and it’s unclear if and when it will reappear. Here are just a few examples:

The Department of Health and Human Services, now led by Robert F. Kennedy Jr., laid off the 17-person team in charge of the National Survey on Drug Use and Health, which for more than five decades has tracked trends in substance abuse and mental health disorders. The department’s Administration for Children and Families is weeks behind on the annual update of the Adoption and Foster Care Analysis and Reporting System, the nationwide database of child welfare cases, after layoffs effectively wiped out the team that compiles that information. And the department has placed on leave the team that oversees the Pregnancy Risk Assessment Monitoring System, a collection of survey responses from women before and after giving birth that has become a crucial tool in trying to address the country’s disconcertingly high rate of maternal mortality.

The Centers for Disease Control and Prevention has eviscerated divisions that oversee the WISQARS database on accidental deaths and injuries — everything from fatal shootings to poisonings to car accidents — and the team that maintains AtlasPlus, an interactive tool for tracking HIV and other sexually transmitted diseases.

The Environmental Protection Agency is planning to stop requiring oil refineries, power plants and other industrial facilities to measure and report their greenhouse-gas emissions, as they have done since 2010, making it difficult to know whether any of the policies meant to slow climate change and reduce disasters are effective. The EPA has also taken down EJScreen, a mapping tool on its website that allowed people to see how much industrial pollution occurs in their community and how that compares with other places or previous years.

The Office of Homeland Security Statistics has yet to update its monthly tallies on deportations and other indices of immigration enforcement, making it difficult to judge President Donald Trump’s triumphant claims of a crackdown; the last available numbers are from November 2024, in the final months of President Joe Biden’s tenure. (“While we have submitted reports and data files for clearance, the reporting and data file posting are delayed while they are under the new administration’s review,” Jim Scheye, director of operations and reporting in the statistics unit, told ProPublica.)

And, in a particularly concrete example of ceasing to measure, deep cutbacks at the National Weather Service are forcing it to reduce weather balloon launches, which gather a vast repository of second-by-second data on everything from temperature to humidity to atmospheric pressure in order to improve forecasting.

Looked at one way, the war on measurement has an obvious potential motivation: making it harder for critics to gauge fallout resulting from Trump administration layoffs, deregulation or other shifts in policy. In some cases, the data now being jettisoned is geared around concepts or presumptions that the administration fundamentally rejects: EJScreen, for instance, stands for “environmental justice” — the effort to ensure that communities don’t suffer disproportionately from pollution and other environmental harms. (An EPA spokesperson said the agency is “working to diligently implement President Trump’s executive orders, including the ‘Ending Radical and Wasteful Government DEI Programs and Preferencing.’” The spokesperson added: “The EPA will continue to uphold its mission to protect human health and the environment” in Trump’s second term.) The White House press office did not respond to a request for comment.

Laura Lindberg, a Rutgers public health professor, lamented the threatened pregnancy-risk data at the annual conference of the Population Association of America in Washington last week. In an interview, she said the administration's cancellation of data collection efforts reminded her of recent actions at the state level, such as Florida's withdrawal in 2022 from the CDC's Youth Risk Behavior Survey after the state passed its law discouraging classroom discussion of sexual orientation. (The state's education secretary said the survey was "inflammatory" and "sexualized.") Discontinuing the survey made it harder to discern whether the law had adverse mental health effects among Florida teens. "States have taken on policies that would harm people and then are saying, 'We don't want to collect data about the impact of the policies,'" Lindberg said. "Burying your head in the sand is not going to be a way to keep the country healthy." (HHS did not respond to a request for comment.)

Making the halt on data gathering more confounding, though, is the fact that, in some areas, the information at risk of being lost has been buttressing some of the administration’s own claims. For instance, Trump and Vice President JD Vance have repeatedly cited, as an argument for tougher border enforcement, the past decade’s surge in fentanyl addiction — a trend that has been definitively captured by the national drug use survey that is now imperiled. That survey’s mental health components have also undergirded research on the threat being posed to the nation’s young people by smartphones and social media, which many conservatives have taken up as a cudgel against Big Tech.

Or take education. The administration and its conservative allies have been able to argue that Democratic-led states kept schools closed too long during the pandemic because there was nationwide data — the National Assessment of Educational Progress, aka the Nation’s Report Card — that showed greater drops in student achievement in districts that stayed closed longer. But now NAEP is likely to be reduced in scope as part of crippling layoffs at the Department of Education’s National Center for Education Statistics, which has been slashed from nearly 100 employees to only three, casting into doubt the future not only of NAEP but also of a wide array of long-running longitudinal evaluations and the department’s detailed tallies of nationwide K-12 and higher education enrollment. The department did not respond to a request for comment but released a statement on Thursday saying the next round of NAEP assessments would still be held next year.

Dan Goldhaber, an education researcher at the University of Washington, cast the self-defeating nature of the administration’s war on educational assessment in blunt terms: “The irony here is that if you look at some of the statements around the Department of Education, it’s, ‘We’ve invested X billion in the department and yet achievement has fallen off a cliff.’ But the only reason we know that is because of the NAEP data collection effort!”

Shelly Burns, a mathematical statistician who worked at NCES for about 35 years before her entire team was laid off in March, made a similar point about falling student achievement. “How does the country know that? They know it because we collected it. And we didn’t spin it. We didn’t say, ‘Biden is president, so let’s make it look good,’” she said. “Their new idea about how to make education great again — how will you know if it worked if you don’t have independent data collection?”

“Reality has a well-known liberal bias,” Stephen Colbert liked to quip, and there have been plenty of liberal commentators who have, over the years, taken that drollery at face value, suggesting that the numbers all point one way in the nation’s political debates. In fact, in plenty of areas, they don’t.

It’s worth noting that Project 2025’s lengthy blueprint for the Trump administration makes no explicit recommendation to undo the government’s data-collection efforts. The blueprint is chock full of references to data-based decision-making, and in some areas, such as immigration enforcement, it urges the next administration to collect and share more data than its predecessors had.

But when an administration is making such a concerted effort to stifle assessments of government and society at large, it is hard not to conclude that it lacks confidence in the efficacy of its current national overhaul. As one dataset after another falls by the wayside, the nation’s policymakers are losing their ability to make evidence-based decisions, and the public is losing the ability to hold them accountable for their results. Even if a future administration seeks to resurrect some of the curtailed efforts, the 2025-29 hiatus will make trends harder to identify and understand.

Who knows if the country will be able to rebuild that measurement capacity in the future. For now, the loss is incalculable.


'Crucial' climate data center shutters as federal funding expires. What's it mean for Louisiana?
Apr 17, 8:00 PM

  • The Oregon Institute for Creative Research (map)

Photo from Chris Granger, The Times-Picayune

By KASEY BUBNASH and JOSIE ABUGOV | Staff writers

The center that provides near real-time climate data to government agencies and private companies across the South was shut down Thursday after its base federal funding expired. 

According to a statement posted to its now-defunct website, the Southern Regional Climate Center is one of four such centers across the U.S. that were abruptly shuttered after lapses in funding from the U.S. Department of Commerce through the National Oceanic and Atmospheric Administration.

"Unfortunately, all data and services offered under the base contract, including this website, will be unavailable unless and until funding is resumed," the statement reads. "Monitor this website during the upcoming days, weeks, and months for information on replacement resources that may be offered with alternative funding sources."

As of 3 p.m. Thursday, only the regional climate centers encompassing the northeast and western U.S. were still up and running. Alerts on their homepages, however, warned that "support for this website may be unavailable starting June 17, 2025."

Alison Tarter, a research specialist with the Southern Regional Climate Center, said the center's five-year contract is funded by grants that have to be approved by the federal government each year. This year's deadline came and went without approval, she said.

For now, the center, based in College Station, Texas, is not providing services, its website is down and its handful of staff members are finding other work, Tarter said. That could change, she said, if the center's funding is approved. 

“We really honestly, on our end, don’t know much," she said. "We’re just kind of dead in the water without something happening in the federal level.”

'Crucial' data

The closures come amid roller-coaster threats to cut funding and staff to much of the federal government, including NOAA, the country's leading climate and weather agency. 

Scott Smullen, a spokesperson for NOAA, declined to comment on the situation, citing a "long-standing practice" of not discussing internal personnel and management matters. 

NOAA's Regional Climate Center Program was first established in 1983 in an effort to expand access to climate information, making data collected from various sources uniform and easy to use. Today, climate centers in six U.S. regions — the High Plains, Midwest, Northeast, Southeast, South and West — offer publicly accessible and custom data sets to users across a wide range of sectors, including wildlife and fisheries departments, farmers and ranchers, construction companies, transportation departments, climatologists and meteorologists.

"Their data is sort of crucial to our monitoring mission," Louisiana State Climatologist Jay Grymes said. 

The Southern Regional Climate Center is operated by the Texas A&M University System and encompasses Louisiana, Mississippi, Arkansas, Tennessee, Texas and Oklahoma. It collects data from sources across the region and compiles it all into one easily accessible format in real time.

Alexa Trischler, a meteorologist at WWL-Louisiana, said the regional center is a "valuable resource" and that losing it could harm the field.

"Looking at past climate data often helps with putting better long-term forecasts together in the future to keep people safe and ready for what's next," Trischler said. "It's always a huge benefit when you're able to have this data at your fingertips to make projections about the future, and to have this go away is disheartening."

'Unacceptable'

Following the website shutdown, a range of broadcast and private sector meteorologists took to social media, calling the move "unacceptable" and "a disaster" while stressing the importance of the data. 

The New Orleans Office of Homeland Security and Emergency Preparedness said the data is important for drought monitoring, conducting "deep dives" into climate data and gauging climate norms based on historical precedents, among other purposes.

But it is used for purposes far beyond forecasting, Grymes noted. A construction company might use it to prove it couldn't complete a project on deadline due to several days of rain. A farmer might use it to find the right time to plant or harvest crops. Grymes, as a climatologist, uses it to quickly analyze and monitor the state's weather patterns.

“The reality is you would be hard pressed to come up with any industry that’s not impacted by weather," he said. 

Grymes said regional climate center data is also “the backbone” for some National Weather Service products. 

“So killing this program — it’s really hard at this point to evaluate just how much of a problem this is going to be, not only for offices like mine but also people all across the country, including the weather service," Grymes said. 

Grymes said it's unclear whether the Southern Regional Climate Center's data would still be available in some other format. 

"It's important that we don't lose important weather data from any entity going forward because it could potentially negatively impact forecasting," Trischler said. 

John Neilson-Gammon, who leads the Southern Regional Climate Center, did not return a call requesting comment. 

Why Katy Perry's celebrity spaceflight blazed a trail for climate breakdown
Apr 17, 6:00 PM

  • The Oregon Institute for Creative Research (map)

Public Domain

by Steve Westlake, The Conversation

What's not to like about an all-female celebrity crew riding a rocket into space? Quite a lot, as it turns out.

Katy Perry and her companions were initially portrayed in the media as breaking down gender barriers. On their return to Earth, the team enthused about protecting the planet and blazing a trail for others. Perry even sang What a Wonderful World during the flight, and kissed the ground on exiting the spacecraft.

But the backlash was swift. Fellow celebrities piled in to highlight the "hypocrisy" of such an energy-intensive endeavor from a former Unicef climate champion. Evidence was quickly presented to dispute the pollution-free claims of the Blue Origin rocket, which is fueled by oxygen and hydrogen. (In fact, the water vapor and nitrogen oxide emissions it creates add to global heating, on top of the emissions from the program as a whole.)

But it's the negative social effects of this kind of display from celebrities (of any gender) that our research sheds light on. I'm part of a team of social scientists researching the powerful effects of politicians, business leaders and celebrities who lead by example on climate change—or don't.

Social kickback

Space tourism, and other energy-intensive activities by people in the public eye, such as using helicopters and private jets, have a much wider knock-on effect than the direct damage to the climate caused by the activity itself.

We carried out focus groups with members of the public to understand their reactions to the high-carbon behavior of leaders in politics, culture and business. We also conducted experiments and surveys to test the effects of leaders "walking the talk" on climate change. We found that observing unnecessary high-carbon behavior demotivates people and reduces the sense of collective effort that is essential for a successful societal response to climate change.

According to climate science, solving climate change and other environmental crises requires fundamental changes to economies, societies and lifestyles. Using much less energy, not just different kinds of energy, can play a big part in halting the damage. And it is the wealthiest people in the richest countries who use the most energy and set the standards and aspirations for the rest of society. That's why the Blue Origin dream (of space exploration for the unfathomably wealthy) is a nightmare for the climate: it perpetuates an unsustainable culture.

Our findings reveal that when people see public figures behaving like this, they are less willing to make changes to their own lives. "Why should I do my bit for the climate when these celebrities are doing the opposite?" is the question people repeatedly asked in our research.

Many of the changes to behavior necessary to tackle climate change will require people to accept trade-offs and embrace alternative ways of living. This includes using heat pumps instead of gas boilers, trading in large, fossil-fueled vehicles (or even avoiding cars altogether) and forgoing flights—because there is no way to decarbonize long-distance flights in time.

When celebrities (or politicians and business leaders, for that matter) ignore the environmental damage of their choices, it sends a powerful signal that they are not really serious about addressing climate change.

Not only does this undermine people's motivation to make changes, it reduces the credibility of leaders. That in turn makes coordinated climate action less likely, because shifting to a low-carbon society will require public trust in leadership and a sense of collective effort.

Individual choices matter

The widespread aversion to Perry's space flight contradicts the popular argument that tackling the climate crisis "is not about individual behavior."

On the contrary, the response shows that these actions from celebrities and other leaders have much greater symbolic meaning than is captured by the idea of an "individual choice." People are highly attuned to the behavior of others because it signals and reinforces the values, morals and norms of our society. As such, few if any choices are truly "individual."

This message of collective responsibility is one our current economic and political system works hard to suppress by championing unlimited freedom to consume, while ignoring the loss of freedom that such behavior causes: freedom to live in a stable climate, freedom from pollution, freedom from extreme weather, freedom for future generations.

In fact, research reveals that most people understand the interconnectedness of society and the need for a coordinated response to the climate crisis. Climate assemblies, which convene ordinary citizens to discuss and deliberate a course of climate action, have revealed a willingness to curtail some activities in a fair way.

When it comes to preserving a livable planet and a stable climate, most people know that space tourism and ultra-high-carbon living are off the agenda. Celebrities have a positive role to play in leading by example. It's not rocket science.

Read more here

Climate change will make rice toxic, say researchers
Apr 17, 3:00 PM

  • The Oregon Institute for Creative Research (map)

Credit: Aman Rochman/NurPhoto/Getty Images

Warmer temperatures and increased carbon dioxide will boost arsenic levels in rice.
Inside Climate News – Apr 17, 2025 6:46 AM

Rice, the world’s most consumed grain, will become increasingly toxic as the atmosphere heats and as carbon dioxide emissions rise, potentially putting billions of people at risk of cancers and other diseases, according to new research published Wednesday in The Lancet Planetary Health.

Eaten every day by billions of people and grown across the globe, rice is arguably the planet’s most important staple crop, with half the world’s population relying on it for the majority of its food needs, especially in developing countries.

But the way rice is grown—mostly submerged in paddies—and its highly porous texture mean it can absorb unusually high levels of arsenic, a potent carcinogenic toxin that is especially dangerous for babies.

Lewis Ziska, a plant physiologist and associate professor at Columbia University, has studied rice for three decades and has more recently focused his research on how climate change reduces nutrient levels across many staple crops, including rice. He teamed up with researchers from China and the US to conduct a first-of-its-kind study, looking at how a range of rice species reacted to increases in temperature and carbon dioxide, both of which are projected to occur as more greenhouse gas emissions are released into the atmosphere as a result of human activities. The new study was published in The Lancet Planetary Health.

“Previous work has focused on individual responses—some on CO2 and some on temperature, but not both, and not on a wide range of rice genetics,” Ziska said. “We knew that temperature by itself could increase levels, and carbon dioxide by a little bit. But when we put both of them together, then wow, that was really something we were not expecting. You’re looking at a crop staple that’s consumed by a billion people every day, and any effect on toxicity is going to have a pretty damn large effect.”

For six years, Ziska and a large team of research colleagues in China and the US grew rice in controlled fields, subjecting it to varying levels of carbon dioxide and temperature. They found that when both increased, in line with projections by climate scientists, the amount of arsenic and inorganic arsenic in rice grains also went up.

Arsenic is found naturally in some foods, including fish and shellfish, and in waters and soils.

Inorganic arsenic is found in industrial materials and gets into water—including water used to submerge rice paddies.

Rice is easily crowded out by weeds and other plants, but it has one advantage: It grows well in water. So farmers germinate the seeds and, when the seedlings are ready, plant them in wet soil. They then flood their fields, which suppresses weeds but allows the rice to flourish. Rice readily absorbs the water and everything in it—including arsenic, either naturally occurring or not. Most of the world’s rice is grown this way.

The new research demonstrates that climate change will ramp up those levels.

“What happens in rice, because of complex biogeochemical processes in the soil, when temperatures and CO2 go up, inorganic arsenic also does,” Ziska said. “And it’s this inorganic arsenic that poses the greatest health risk.”

Exposure to inorganic arsenic has been linked to cancers of the skin, bladder, and lung, heart disease, and neurological problems in infants. Research has found that in parts of the world with high consumption of rice, inorganic arsenic increases cancer risk.

Ziska and his colleagues took the data from their field trials and then, based on per capita consumption data in seven of the top rice-consuming countries in Asia, projected how disease risk could also increase. They found that in those seven countries—Vietnam, Indonesia, China, Bangladesh, the Philippines, Myanmar and India—disease risk rose across the board.

“There is a toxicological effect of climate change relative to one of the most consumed staples in the world,” Ziska said, “and the consumption is one of the hallmarks of whether you’re going to be vulnerable to that effect.”

Researchers have known that rice can contain high levels of arsenic, and regulators have suggested exposure limits, especially for infants, who are particularly vulnerable and tend to eat a lot of rice. This new research should put extra pressure on regulators to set more stringent thresholds, the authors say. The US Food and Drug Administration has never set limits for arsenic in foods.

The researchers also point to the potential of various interventions that could limit exposure to inorganic arsenic from rice, including developing strains of rice that are less absorbent and educating consumers about alternatives to rice.

“Rice has always been a food where arsenic is an issue, and climate change is making it worse,” said Keeve Nachman, one of the report’s authors, a professor at Johns Hopkins University and a longtime researcher of health risks related to food production and consumption. “This is one more reason to intervene—to control people’s exposure. The No. 1 thing we can do is everything in our power to slow climate change.”

This story originally appeared on Inside Climate News.

Appeals court temporarily halts disbursement of contested climate funds
Apr 17, 1:00 PM

  • The Oregon Institute for Creative Research (map)

Pablo Martinez Monsivais, Associated Press file

The Environmental Protection Agency Building is shown in Washington on Sept. 21, 2017.

by Rachel Frazin - 04/17/25 5:09 PM ET

An appeals court has temporarily halted a lower court’s order that enabled the release of contested climate funds.

Earlier this week, District Judge Tanya Chutkan blocked the Environmental Protection Agency (EPA) from clawing back billions of dollars in climate funds that were given to climate finance organizations during the Biden administration.

Her order directed Citibank to release the funds to the green bank groups as soon as Thursday.

However, late Wednesday an appeals court issued a different ruling that prevented the funds from being released and instead maintained the status quo.

A panel of appeals court judges ordered that the funds should neither be returned to the U.S. Treasury Department nor released to the climate organizations so that the panel would have adequate time to consider the case. 

The funds in question were part of a $20 billion program from the Democrats’ Inflation Reduction Act that gave nonprofits money to use to fund climate-friendly projects.

The Biden administration awarded that $20 billion to eight institutions. Since taking office, the Trump administration has tried to recoup the money.

When Chutkan ordered the funds released, the EPA appealed. It said in a statement at the time that the grants “are terminated, and the funds belong to the U.S. taxpayer. We couldn’t be more confident in the merits of our appeal and will take every possible step to protect hard-earned taxpayer dollars.”

The agency declined to comment on the latest order.

Meanwhile, Beth Bafford, CEO of Climate United, which was one of the grant recipients, said in a written statement, “We remain firm on the merits of our case and will press forward to deliver on our promises to communities across America.”

Read more here

Carbon removal startup Holocene bought by oil and gas giant Occidental
Apr 17, 1:00 PM

  • The Oregon Institute for Creative Research (map)

Image Credits: Brandon Bell / Getty Images

Tim De Chant / 1:05 PM PDT · April 17, 2025

Occidental has bought Holocene, the second direct air capture startup the fossil fuel company has acquired in two years.

The deal was executed through Oxy Low Carbon Ventures, a subsidiary of the oil and gas company, for an undisclosed amount. HeatMap first reported the news.

Holocene had been racing to advance its amino acid-based carbon removal technology following a $10 million deal it signed in September with Google to deliver 100,000 metric tons of carbon removal by the early 2030s.

At $100 per metric ton, the price was significantly lower than what competitors could offer today. Currently, removing carbon dioxide directly from the atmosphere is estimated to cost around $600 per metric ton.

Occidental’s interest in carbon capture stems from a technique known as enhanced oil recovery, in which CO2 is injected underground to stimulate oil wells. The company bought another direct air capture startup, Carbon Engineering, in 2023 for $1.1 billion.

An Occidental spokesperson told HeatMap that the company will be using Holocene’s technology to further its direct air capture research and development.

Direct air capture qualifies for tax credits under the Inflation Reduction Act, with the final incentive dependent on whether the equipment uses zero-emission power and if the captured carbon dioxide is used for enhanced oil recovery. 

Read more here

Forecast for weaker weather service: Americans will die, businesses will lose billions
Apr 14, 12:30 PM

  • The Oregon Institute for Creative Research (map)

Members of NOAA's National Severe Storms Laboratory monitor a thunderstorm in Kansas, 2009. (Photo: Dr. Mike Coniglio, NOAA NSSL)

By Toby Ault, Daniele Visioni, Peter Hitchcock | March 14, 2025

An invisible river of information flows through our daily lives, powering American commerce and keeping all of us safe in our homes, offices, and on our roadways. Its keepers are the dutiful public servants at the National Oceanic and Atmospheric Administration (NOAA) and the National Weather Service (NWS). The recent elimination of over 800 positions, with another 1,000 planned, will not only threaten lives and diminish US leadership in weather prediction—it will invariably disrupt countless industries, from finance to agriculture to reinsurance.

Like haphazardly dismantling sections of our interstate highway system, these cuts create dangerous gaps in our national capacity that the private sector cannot fill. Every time an airline routes around turbulence, an insurance company prices a policy, or a farmer plans their planting season, they rely on a sophisticated network of 620 facilities nationwide that includes 100 upper-air monitoring sites and crucial satellite operations centers, as well as advanced numerical models of weather and the supercomputers required to run them. This infrastructure, supporting more than one-third of US GDP, requires sustained investment in both physical assets and highly trained personnel with advanced degrees.

Businesses and lawmakers must step up and stop the hemorrhaging of NOAA and NWS data products and personnel before it’s too late. If saving money or improving efficiency is the goal of DOGE’s activities, then the economic case for protecting NOAA and NWS is clear: Their activities support fully one-third of US GDP, making these services essential to private sector success. In terms of return on investment, every US dollar spent on weather services yields $73 in documented returns.

Some might suggest that artificial intelligence and machine learning could fill the gap left by these cuts. Indeed, companies like Google DeepMind, Huawei, and Nvidia have made impressive advances in AI-based weather prediction, but these tools can only amplify, not replace, NOAA and NWS expertise. They rely entirely on the infrastructure we’re now dismantling: decades of climate data gathered by NOAA satellites, weather balloons, and radar systems, all interpreted through traditional physics-based models. The current cuts directly impair NOAA’s ability to collect new data, with weather balloon launches already suspended in multiple locations. Without real-time input from weather balloons, remote sensing, and in-situ measurements, no amount of machine learning can improve forecasts. If the expertise is lost and infrastructure is dismantled, all forecasts will be degraded––“garbage in, garbage out.”

Not only are machine learning and AI insufficient on their own to replace NOAA and NWS personnel, but the haphazard and careless way the firings have unfolded means that many early-career scientists who are experts in these fields have recently lost their jobs. We are keenly aware of the unique expertise that these extraordinarily brilliant, talented, and hardworking individuals bring to the US government, many of them having been our former students. These individuals will inevitably find opportunities in the private sector or in other countries. And that is precisely our point: Losing talent and capacity in the AI and machine learning space will weaken the US government as a whole and make it much less efficient overall.

Conservatives who support the cuts and firings might be tempted to invoke Reagan’s “Starve the Beast” theory of government, with the notion that pushing talented people into the private sector would make American business more competitive globally. Yet this misrepresents the nature of climate and weather data as well as Reagan’s actual approach. Reagan himself demonstrated that conservative leadership can embrace both scientific evidence and national security when he signed the Montreal Protocol to protect the ozone layer—a decision that protected both American interests and the global environment. Even during the height of 1980s privatization, the Reagan administration recognized that essential public infrastructure—from interstate highways to satellite communications systems—was a prerequisite for private sector success. Indeed, today’s private space companies owe their existence to those early federal investments in space infrastructure.

The critical infrastructure provided by NOAA and NWS cannot and will never be duplicated by the private sector. Companies do, however, build upon public data and federal expertise to create value-added products. Destroying this infrastructure will make weather and climate data less reliable and more costly: Insurance companies will have to hedge against greater uncertainty, farmers and growers will lose access to free NWS predictions, and transportation networks will face increased risks. These changes will drive up prices across the economy and hurt American competitiveness in the global marketplace. Moreover, we will cede leadership in climate and weather forecasting to other centers, like the European Centre for Medium-Range Weather Forecasts.

Put bluntly: Americans will die and American businesses will lose untold billions if we do not protect NOAA and NWS. The private sector can’t replace the expansive networks of observations and modeling carried out by these organizations, nor can it replace the years of education and training required to sustain a competitive, competent, and scientifically advanced workforce. Machine learning and AI cannot save US businesses from the devastating impacts of losing our weather infrastructure. The loss of specialized personnel—from tsunami warning scientists to hurricane hunters—creates vulnerabilities that will ripple through our economy, increasing costs and risks across every sector that depends on reliable information about climate and weather.

Cornell University professors Flavio Lehner, Angeline Pendergrass, and Jonathan Lin also contributed to this piece.

Read more here

Fungal infections are ‘taking over the world’. Can they be stopped?
Apr 6, 10:30 AM

  • The Oregon Institute for Creative Research (map)

Surgeons at the Seven Star hospital in Nagpur, India, operating on a patient with mucormycosis ‘black fungus’ Credit: Simon Townsley

Sophie O’Sullivan | 07 April 2025 6:00am BST

Blood congealed “like black sausage”, sexually transmitted athlete’s foot, and bloodstream-borne pathogens untreatable with existing drugs. These are the kinds of fungal infections that Darius Armstrong-James, Professor of Infectious Diseases and Medical Mycology at Imperial College London, is used to treating.

“Probably about a third of the world is infected by some kind of fungus,” says Prof Armstrong-James, “mostly skin, mucocutaneous, vaginal candidiasis, athlete’s foot. Those kinds of fungi that aren’t deadly but they are increasing in resistance”.

More lethal fungal varieties are spreading too: invasive fungal infections are killing an estimated 2.5 million people each year – twice the global fatalities of tuberculosis.

The world remains critically underprepared for fungal infections, the World Health Organization (WHO) warned this week, with a lack of diagnostic tests, effective treatments, and surveillance creating an urgent need for research.

But how serious is the problem? 

The WHO’s fungal priority pathogens list, compiled in response to this rising public health threat, is an itch-inducing read.

‘Critical priority’ fungi with mortality rates of up to 88 per cent take the top spots.

“Black fungus” or Mucormycosis, which turns tissue into black lesions, made headlines during the Covid-19 pandemic when 51,000 cases were reported in India.

“It invades very often through the nose, and then it can get into the eyes […] down the optic nerve into the brainstem and kills you,” Prof Armstrong-James told the Telegraph.

“We have to give [patients] all the strongest drugs we can find…cut out all of the infected tissue which often means major surgery to the face and half their brain”.

For some patients, the amount of blackened tissue that needs removing is so extensive that surgery is impossible, and they die within days.

Yet Mucorales, the fungal family that causes Mucormycosis, is not among those considered ‘critical’ in the WHO ranking.

There are four invasive fungal pathogens deemed ‘critical’ on the list, and their insidious spores can even be found in the UK.

One of them, Candida albicans, can be found “in about half the population inside our guts,” says Dr Rebecca Drummond, an associate professor in antifungal immunity at the University of Birmingham.

Aspergillus, another critical priority pathogen, is in fact so widespread that most people inhale between 100 and 1,000 spores every day from the air we breathe.

Even the mould on bread can contain Mucor, a fungus that causes Mucormycosis.

Read more here

View Event →
Our Approach to Climate Policy Has Failed. It’s Time for Climate Realism
Apr
6
10:30 AM

Our Approach to Climate Policy Has Failed. It’s Time for Climate Realism

  • The Oregon Institute for Creative Research (map)
  • Google Calendar ICS

The skyline of New York City’s lower Manhattan is reflected in the Hudson River. New York City is one of several American cities that will face severe flooding this century. Gary Hershorn/Getty Images

U.S. policymakers need a new strategy to confront the risks of climate change, compete in the global energy transition, and stay the course regardless of which political party is in power. A doctrine of “climate realism” could earn bipartisan support by decisively pursuing American interests.

Article by Varun Sivaram
April 7, 2025 12:06 am (EST)

The U.S. response to climate change represents a profound foreign policy failure. A convenient excuse for dismal results is that only administrations and lawmakers on one side of the aisle have prioritized climate, enacting policies that are promptly reversed when political power switches hands. Every four years, the United States joins or exits the Paris Agreement and lavishly spends billions of dollars on clean energy subsidies or claws them back.  

Yet, the very fact that climate has no political staying power is an indictment of the policy approach presented to U.S. voters, not a valid excuse for why those policies inevitably fail. While Washington dithers, foreign climate-warming emissions—the vast majority of the world’s total—threaten the American homeland with ever-worsening disasters. And decades of poorly targeted domestic subsidies have failed to make U.S. clean technologies competitive with China’s.

The United States needs a new doctrine for its approach to climate change, one that rises above today’s partisan disagreements, pragmatically advances U.S. interests, and aligns with the priorities of American voters. Climate realism draws inspiration from sound arguments that have bipartisan appeal while jettisoning misguided proposals championed by partisans on the left and the right.

Debunking Four Fallacies

The climate realism doctrine is both realist and realistic. It is realist in that it prioritizes advancing U.S. interests and recognizes that other countries will single-mindedly prioritize their own interests. And it is realistic by dispensing with four fallacies that too often muddle policy thinking on climate.

1. The world’s climate targets are achievable. They are not. The 2015 Paris Agreement’s internationally agreed target of limiting the rise in global average temperature to “well below” 2°C (3.6°F) above preindustrial levels will almost certainly be breached by century-end, given that global greenhouse gas emissions continue to rise. Similarly, the target of net-zero emissions by 2050 is utterly implausible. The world is likely on track to warm on average by 3°C (5.4°F) or more this century.

To be sure, clean energy has made remarkable progress. Solar power is now the cheapest and fastest-growing power source on the planet. But the roughly $10 trillion in annual investment needed to fundamentally overhaul the global economy and infrastructure base is more than voters and governments around the world are willing to shoulder. And the innovations that would enable deep decarbonization of high-emissions sectors, such as heavy industries and long-distance transportation, remain far from commercialization. The preponderance of available data signals that the global economy will fail to reach net-zero emissions in the twenty-first century.

2. Reducing U.S. domestic greenhouse gas emissions can make a meaningful difference. U.S. domestic emissions will be largely irrelevant to global climate change. The trajectory of climate change in the twenty-first century will depend on future global cumulative emissions between 2025 and 2100. By that measure, the United States is on track to account for around 5 percent of global future cumulative emissions. China—as well as emerging and non-advanced economies including India, Indonesia, Brazil, and South Africa—will account for more than 85 percent of that total. Slowing climate change principally depends on reducing emissions outside of U.S. borders.

Many argue, however, that reducing U.S. emissions demonstrates international leadership, or signaling, that can persuade other countries to reduce their emissions. This is wrong as well. Unilateral U.S. emissions reductions do not change the fundamental calculus of other countries when it comes to decarbonizing their own economies. The evidence was clearest when the United States passed its Inflation Reduction Act, an expensive, $1.2 trillion subsidy package that would reduce future U.S. emissions. Countries across Asia, Europe, and more cried foul, far more outraged at the law’s effect on economic trade and support for U.S. domestic manufacturing than encouraged to reduce their own emissions. 

3. Climate change poses a manageable risk to U.S. economic prosperity and national security. This is wishful thinking. The so-called “tail risks” from runaway climate change are both cataclysmic and too plausible to be ignored. Unfortunately, too much attention is paid to economists’ central estimates of climate damages, rather than to the tails. For example, the Congressional Budget Office’s central estimate is that by 2100, the United States will lose 6 percent of gross domestic product (GDP) compared to a scenario with no climate change. This seemingly trivial loss of GDP might relegate climate change to a third-tier risk, well below a global pandemic or nuclear war.

Yet, these figures might underestimate climate’s impact by an order of magnitude or more. The risks of seven-foot sea-level rise, dramatically intensified hurricanes, wildfires, and hailstorms, and entire U.S. cities being wiped off the map this century are nontrivial. On a relative basis, the United States might emerge better off than other countries that are even harder hit. But damage on this scale could endanger the survival of American society as we know it.

4. The clean energy transition is necessarily a win-win for U.S. interests and climate action. In reality, the unfolding energy transition carries serious risks as well as potential opportunities for U.S. interests. The United States is the world’s largest oil and gas producer and one of its largest exporters, a position that brings U.S. energy security, economic prosperity, and global geopolitical leverage. However, China has emerged as by far the dominant producer of clean energy technologies, spanning solar panels, wind turbines, batteries, and electric vehicles. On its current course, a global transition to clean energy would degrade U.S. economic and security interests while advancing China’s. The only way to align U.S. interests with a clean energy transition is for the United States to develop innovative, globally competitive clean technology industries.

U.S. policymakers in both major parties have too often succumbed to one or more of these fallacies. Discarding them is the first step toward a clear-eyed and constructive agenda.

Read more here

View Event →
Antarctica’s melting ice sheets may trigger massive volcanic activity
Apr
4
10:30 AM

Antarctica’s melting ice sheets may trigger massive volcanic activity

  • The Oregon Institute for Creative Research (map)
  • Google Calendar ICS

Melting ice sheets in West Antarctica could trigger powerful volcanic eruptions, creating a dangerous feedback loop that accelerates sea-level rise. (CREDIT: George Steinmetz / CORBIS)

Melting Antarctic ice sheets may trigger volcanic eruptions, accelerating global sea-level rise through a dangerous feedback cycle.

Joseph Shavit
Published Apr 5, 2025 1:07 PM PDT

When volcanic eruptions make headlines, the images often depict fiery lava and towering ash clouds. But beneath Antarctica’s frozen landscape, volcanoes quietly shape Earth's climate in surprising ways. Recent scientific studies reveal that melting ice sheets in West Antarctica might trigger volcanic activity, creating a cycle that speeds up ice loss and sea-level rise.

Scientists studying Earth's geological past have found that volcanoes covered by ice sheets react strongly when the ice melts. As thick ice disappears, it removes a heavy weight from the surface. The land underneath then lifts slightly, easing pressure on magma chambers hidden deep within the Earth. This process, called isostatic rebound, can push magma upward, causing eruptions that further melt the ice above.

Ice Sheets, Volcanoes, and Climate

West Antarctica, home to one of Earth's largest ice sheets, sits atop a volcanic hotspot known as the West Antarctic Rift. This region contains over 100 volcanic centers—many hidden beneath ice layers thousands of meters thick. The ice not only hides these volcanoes but also stabilizes them. Its massive weight holds magma chambers under control, preventing frequent eruptions. But when climate change thins the ice, this balance is disrupted.

Schematic of the thermomechanical magma chamber model with simulated ice unloading from this study. Transparent arrows represent ice unloading as a decrease in the ice layer thickness over time. (CREDIT: Geochemistry, Geophysics, Geosystems)

Researchers recently used computer simulations to study how shrinking ice sheets impact these hidden volcanic systems. They discovered that the rate at which ice disappears greatly affects volcanic behavior.

Faster melting reduces pressure quickly, allowing magma chambers to expand and push magma upward. This increased volcanic activity melts even more ice, creating a dangerous feedback loop that could speed up global sea-level rise.

Dr. Allie Coonin, a researcher at Brown University who led the study, explains the process clearly: "As the ice melts away, the reduced weight on the volcano allows the magma to expand. It applies pressure upon the surrounding rock that may facilitate eruptions."

The consequences of this interaction are significant. When magma chambers deep beneath ice sheets expand, dissolved gases—mostly carbon dioxide and water—begin forming bubbles. These bubbles increase pressure within the magma, making eruptions more likely and potentially more intense. In essence, the melting ice sheet opens the door for explosive volcanic activity.

Lessons from the Andes

To understand how glaciers influence volcanoes, researchers also looked at volcanic records from the Andes mountains in South America. Around 18,000 years ago, large ice sheets covered volcanoes in Patagonia. As Earth's climate warmed naturally, ice sheets melted rapidly, triggering a series of volcanic eruptions. The timing of these eruptions strongly matches periods when ice was retreating fastest.

This historic pattern confirms the researchers' models: melting glaciers can directly lead to increased volcanic eruptions. According to Coonin, volcanic systems react quickly once pressure is reduced. "We found that the removal of an ice sheet results in larger eruptions," she says. These bigger eruptions release more heat, accelerating ice melt even further.

Today, Antarctica is experiencing conditions similar to Patagonia’s past. Satellite measurements show ice thinning rates up to 3 meters per year in certain West Antarctic areas, a rate scientists consider alarmingly fast. If current melting continues—or accelerates—it could trigger substantial volcanic activity beneath the ice.
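
The scale of that unloading can be sketched with a rough, illustrative calculation. This is not from the study itself: the 3-metre-per-year thinning rate is the article’s figure, while the ice density and gravitational acceleration are standard constants.

```python
# Back-of-the-envelope estimate of lithostatic pressure relief from ice thinning.
# Assumptions: ice density ~917 kg/m^3, g = 9.81 m/s^2; thinning rate of
# 3 m/yr is the fast end of the West Antarctic figures cited in the article.
RHO_ICE = 917.0   # kg/m^3, typical density of glacial ice
G = 9.81          # m/s^2, gravitational acceleration

def unloading_rate_pa(thinning_m_per_yr):
    """Pressure removed from the rock below per year (Pa) as the ice thins."""
    return RHO_ICE * G * thinning_m_per_yr

rate = unloading_rate_pa(3.0)
print(f"{rate / 1000:.0f} kPa of overburden removed per year")  # ~27 kPa/yr
```

Small in absolute terms, but sustained over decades this steady depressurization is the kind of forcing the simulations suggest can let magma chambers expand and gas bubbles form.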

Read more here

View Event →
Invisible losses: thousands of plant species are missing from places they could thrive – and humans are the reason
Apr
2
10:30 AM

Invisible losses: thousands of plant species are missing from places they could thrive – and humans are the reason

  • The Oregon Institute for Creative Research (map)
  • Google Calendar ICS

Semi-natural pastures preserve many different plant species. Pictured: the Hulunbuir grasslands in Inner Mongolia, China. Dashu Xinganling/Shutterstock

April 2, 2025 3:04pm EDT
Cornelia Sattler & Julian Schrader

If you go walking in the wild, you might expect that what you’re seeing is natural. All around you are trees, shrubs and grasses growing in their natural habitat.

But there’s something here that doesn’t add up. Across the world, there are large areas of habitat which would suit native plant species just fine. But very often, they’re simply absent.

Our new research gauges the scale of this problem, known as “dark diversity”. Our international team of 200 scientists examined plant species in thousands of sites worldwide.

What we found was startling. In regions heavily affected by our activities, only about 20% of native plant species able to live there were actually present. But even in areas with very little human interference, ecosystems only contained about 33% of viable plant species.

Why so few species in wilder areas? Our impact. Pollution can spread far from the original source, while conversion of habitat to farms, logging and human-caused fires have ripple effects too.

Conspicuous by their absence

Our activities have become a planet-shaping force, from changing the climate through our emissions to farming 44% of all habitable land. As our footprint has expanded, other species have been pushed to extinction. The rates of species loss are unprecedented in recorded history.

When we think about biodiversity loss, we might think of a once-common animal species losing numbers and range as farms, cities and feral predators expand. But we are also losing species from within protected areas and national parks.

To date, the accelerating loss of species has been largely observed at large scale, such as states or even whole countries. Almost 600 plant species have gone extinct since 1750 – and this is likely a major underestimate. Extinction hotspots include Hawaii (79 species) and South Africa’s unique fynbos scrublands (37 species).

But tracking the fate of our species has been difficult to do at a local scale, such as within a national park or nature reserve.

Similarly, when scientists do traditional biodiversity surveys, we count the species previously recorded in an area and look for changes. But we haven’t tended to consider the species that could grow there – but don’t.

Many plants have been declining so rapidly they are now threatened with extinction.

What did we do?

To get a better gauge of biodiversity losses at smaller scale, we worked alongside scientists from the international research network DarkDivNet to examine almost 5,500 sites across 119 regions worldwide. This huge body of fieldwork took years and required navigating global challenges such as COVID-19 and political and economic instability.

At each 100 square metre site, our team recorded all plant species present and compared them with the species found in the surrounding region. We defined regions as areas of approximately 300 square kilometres with similar environmental conditions.

Just because a species can grow somewhere doesn’t mean it will. To make sure we were recording species that were genuinely missing, we looked at how often each absent species was found at other sampled sites in the region growing alongside the species present at our chosen sites. This helped us detect species well suited to a habitat but missing from it.
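
As a rough illustration of this co-occurrence idea, here is a toy sketch. It is not the DarkDivNet method itself; the presence/absence matrix and the scoring function are invented for this example.

```python
import numpy as np

# Toy presence/absence matrix: rows are sampled sites in one region,
# columns are species from the regional pool (1 = recorded at that site).
# All values here are invented for illustration.
sites = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [1, 0, 1, 0],
])

def cooccurrence_score(sites, site_idx, species_idx):
    """Mean frequency with which `species_idx` (absent at `site_idx`)
    grows alongside that site's resident species at the OTHER sites."""
    present = np.where(sites[site_idx] == 1)[0]
    others = np.delete(np.arange(sites.shape[0]), site_idx)
    freqs = []
    for sp in present:
        # other sites where this resident species also occurs
        hosts = sites[others][sites[others][:, sp] == 1]
        if len(hosts):
            freqs.append(hosts[:, species_idx].mean())
    return float(np.mean(freqs)) if freqs else 0.0

# Species 2 is absent from site 0 but always accompanies site 0's residents
# elsewhere, so it is a strong dark-diversity candidate there; species 3 is not.
print(cooccurrence_score(sites, site_idx=0, species_idx=2))  # 1.0
print(cooccurrence_score(sites, site_idx=0, species_idx=3))  # 0.25
```

A high score flags a species as suited to the site but missing, i.e. part of that site’s dark diversity.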

We then cross-matched data on these missing species against how big the local human impact was by using the Human Footprint Index, which measures population density, land use and infrastructure.

Of the eight components of this index, six had a clear influence on how many plant species were missing: human population density, electric infrastructure, railways, roads, built environments and croplands. Another component, navigable waterways, did not have a clear influence.

Interestingly, the final component – pastures kept by graziers – was not linked to fewer plant species. This could be because semi-natural grasslands are used as pasture in areas such as Central Asia, Africa’s Sahel region and Argentina. Here, long-term moderate human influence can actually maintain highly diverse and well-functioning ecosystems through practices such as grazing livestock, cultural burning and hay making.

Overall, though, the link between greater human presence and fewer plant species was very clear. Seemingly pristine ecosystems hundreds of kilometres from direct disturbance had been affected.

These effects can come from many causes. For instance, poaching and logging often take place far from human settlements. Poaching an animal species might mean a plant species loses a key pollinator or way to disperse its seeds in the animal’s dung. Over time, disruptions to the web of relationships in the natural world can erode ecosystems and result in fewer plant species. Poachers and illegal loggers also cut “ghost roads” into pristine areas.

Other causes include fires started by humans, which can threaten national parks and other safe havens. Pollution can travel and settle hundreds of kilometres from its source, affecting ecosystems.

Our far-reaching influence can also hinder the return of plant species, even in protected areas. As humans expand their activities, they often carve up natural areas into fragments cut off from each other. This can isolate plant populations. Similarly, the loss of seed-spreading animals can stop plants from recolonising former habitat.

What does this mean?

Biodiversity loss is not just about species going extinct. It’s about ecosystems quietly losing their richness, resilience and functions.

Protecting land is not enough. The damage we can do can reach deep into conservation areas.

Was there good news? Yes. In regions where at least a third of the landscape had minimal human disturbance, there was less of this hidden biodiversity loss.

As we work to conserve nature, our work points to a need not just to preserve what’s left but to bring back what’s missing. Now that we know which species are missing in an area but still present regionally, we can begin that work.

Read more here

View Event →
Scientists shielding farming from climate change need more public funding. But they’re getting less
Mar
31
10:00 AM

Scientists shielding farming from climate change need more public funding. But they’re getting less

  • The Oregon Institute for Creative Research (map)
  • Google Calendar ICS

AP Photo/Joshua A. Bickel

By MELINA WALLING
Updated 6:38 AM PDT, March 31, 2025

Erin McGuire spent years cultivating fruits and vegetables like onions, peppers and tomatoes as a scientist and later director of a lab at the University of California-Davis. She collaborated with hundreds of people to breed drought-resistant varieties, develop new ways to cool fresh produce and find ways to make more money for small farmers at home and overseas.

Then the funding stopped. Her lab, and by extension many of its overseas partners, was backed financially by the United States Agency for International Development, which the Trump administration has been dismantling for the past several weeks. Just before it was time to collect data that had been two years in the making, her team received a stop-work order. She had to lay off her whole team. Soon she was laid off, too.

“It’s really just been devastating,” she said. “I don’t know how you come back from this.”

The U.S. needs more publicly funded research and development on agriculture to offset the effects of climate change, according to a paper out in Proceedings of the National Academy of Sciences this month. But instead the U.S. has been investing less. United States Department of Agriculture data shows that as of 2019, the U.S. spent about a third less on agricultural research than its peak in 2002, a difference of about $2 billion. The recent pauses and freezes to funding for research on climate change and international development are only adding to the drop. It’s a serious issue for farmers who depend on new innovations to keep their businesses afloat, the next generation of scientists and eventually for consumers who buy food.

Read more here

View Event →
Rice Scientists Pioneer Method to Tackle ‘Forever Chemicals’
Mar
31
10:00 AM

Rice Scientists Pioneer Method to Tackle ‘Forever Chemicals’

  • The Oregon Institute for Creative Research (map)
  • Google Calendar ICS

Rice University researchers have developed an innovative solution to a pressing environmental challenge: removing and destroying per- and polyfluoroalkyl substances (PFAS), commonly called “forever chemicals.” A study led by James Tour, the T.T. and W.F. Chao Professor of Chemistry and professor of materials science and nanoengineering, and graduate student Phelecia Scotland unveils a method that not only eliminates PFAS from water systems but also transforms waste into high-value graphene, offering a cost-effective and sustainable approach to environmental remediation. This research was published March 31 in Nature Water.

PFAS are synthetic compounds used in various consumer products, valued for their heat, water and oil resistance. However, their chemical stability has made them persistent in the environment, contaminating water supplies and posing significant health risks, including cancer and immune system disruptions. Traditional methods of PFAS disposal are costly, energy-intensive and often generate secondary pollutants, prompting the need for innovative solutions that are more efficient and environmentally friendly.

“Our method doesn’t just destroy these hazardous chemicals; it turns waste into something of value,” Tour said. “By upcycling the spent carbon into graphene, we’ve created a process that’s not only environmentally beneficial but also economically viable, helping to offset the costs of remediation.”

Read More: Rice University

View Event →
Trump administration freezes climate-smart forestry funding in Maine
Mar
31
10:00 AM

Trump administration freezes climate-smart forestry funding in Maine

  • The Oregon Institute for Creative Research (map)
  • Google Calendar ICS

Lumber mill (WGME)

by Allyson Lapierre, WGME
Mon, March 31st 2025 at 6:02 AM

A renewable energy project to protect Maine forests has been put on pause. Millions of federal dollars initially promised to Maine woodland owners are being blocked by the Trump administration.

Six Maine commercial woodland owners were chosen to lead a 12,000-acre climate project to enhance carbon storage in the state.

The federal government issued a $32 million grant to fund the project last year. But that funding has been frozen.

“What this is going to do is remove a lot of opportunities for diverse landowners in the state,” said Brian Milakovsky, with the New England Forestry Foundation.

The grant was promised under the Biden administration but has been blocked and put under review by the Trump administration, leaving landowners, loggers, and others in jeopardy.

“Continue to be a challenge and a constant transformation for forest landowners over the coming decades, and the competitive edge is really going to be growing high-quality sawtimber of the kind that you can sell to sawmills, veneer plants to make into long lived wood products, flooring, furniture,” Milakovsky said.

Patty Cormier of the Maine Forest Service says the pause in funding is impacting not only the future of the forestry sector but also the state’s ecosystems.

“The water, the air we breathe, the wildlife so it has wide ranging impacts. It’s just unfortunate,” Cormier said.

Part of the grant is budgeted for reimbursing Maine companies for forestry work. And without knowing whether the funding is coming through, many jobs are on the line.

“We could see some disruptions to the contractors that work with some of our landowners, due to the fact that these federal funds that are committed, which we have been planning with landowners to [use], have just been frozen without warning,” Milakovsky said.

Forest agencies are optimistic that these funds will eventually come through.

“We are seeing some of the grants at the forest service open, so I’m hoping it’s just the federal government [needing] to go through all these grants,” Cormier said.

In the meantime, agencies are asking lawmakers for help to unfreeze the funds.

Read more here

View Event →
Martian dust could pose health risks to future astronauts
Mar
31
10:00 AM

Martian dust could pose health risks to future astronauts

  • The Oregon Institute for Creative Research (map)
  • Google Calendar ICS

NASA's Curiosity rover reveals the dusty landscape of Mars in this selfie. (Credit: NASA/JPL-Caltech/MSSS)

Published: 3/31/2025
By Daniel Strain

Don’t breathe in the dust on Mars.

That’s the takeaway from new research from a team of scientists, including researchers from the University of Colorado Boulder. The findings suggest that long-term exposure to Martian dust could create a host of health problems for future astronauts—leading to chronic respiratory problems, thyroid disease and more.

The study, published in the journal GeoHealth, is the first to take a comprehensive look at the chemical ingredients that make up Martian dust, and their possible impacts on human health. It was undertaken by a team from the worlds of medicine, geology and aerospace engineering.

“This isn't the most dangerous part about going to Mars,” said Justin Wang, lead author of the study and a student in the Keck School of Medicine at the University of Southern California in Los Angeles. “But dust is a solvable problem, and it’s worth putting in the effort to develop Mars-focused technologies for preventing these health problems in the first place.”

Wang, a CU Boulder alumnus, noted that Apollo era astronauts experienced runny eyes and irritated throats after inhaling dust from the moon. Apollo 17’s Harrison Schmitt likened the symptoms to hay fever.

But scientists know a lot less about the potential harms of Martian dust. To begin to answer that question, Wang and his colleagues drew on data from rovers on Mars and even Martian meteorites to better understand what makes up the planet’s dust. The group discovered a “laundry list” of chemical compounds that could be dangerous for people—at least when inhaled in large quantities and over long periods of time.

They include minerals rich in silicates and iron oxides, metals like beryllium and arsenic and a particularly nasty class of compounds called perchlorates.

In many cases, those ingredients are present in only trace amounts in Mars dust. But the first human explorers on Mars may spend around a year and a half on the surface, increasing their exposure, said study co-author Brian Hynek.

“You’re going to get dust on your spacesuits, and you’re going to have to deal with regular dust storms,” said Hynek, a geologist at the Laboratory for Atmospheric and Space Physics (LASP) at CU Boulder. “We really need to characterize this dust so that we know what the hazards are.”

Into the bloodstream

One thing is clear, he added: Mars is a dusty place.

Much of the planet is covered in a thick layer of dust rich in tiny particles of iron, which gives the planet its famous red color. Swirling dust storms are common and, in some cases, can engulf the entire globe.

“We think there could be 10 meters of dust sitting on top of the bigger volcanoes,” said Hynek, a professor in the Department of Geological Sciences. “If you tried to land a spacecraft there, you’re going to just sink into the dust.”

Wang found his own way to Martian dust through a unique academic path. He started medical school after earning bachelor’s degrees from CU Boulder in astronomy and molecular, cellular and developmental biology, followed by a master’s degree in aerospace engineering sciences. He currently serves in the Navy through its Health Professions Scholarship Program.

He noted that the biggest problem with Martian dust comes down to its size. Estimates suggest that the average size of dust grains on Mars may be as little as 3 micrometers across, or roughly one-ten-thousandth of an inch.

“That’s smaller than what the mucus in our lungs can expel,” Wang said. “So after we inhale Martian dust, a lot of it could remain in our lungs and be absorbed into our bloodstream.”

An ounce of prevention

In the current study, Wang and several of his fellow medical students at USC scoured research papers to unearth the potential toxicological effects of the ingredients in Martian dust.

Some of what they found resembled common health problems on Earth. Dust on Mars, for example, contains large amounts of the compound silica, which is abundant in minerals on our own planet. People who inhale a lot of silica, such as glass blowers, can develop a condition known as silicosis. Their lung tissue becomes scarred, making it hard to breathe—symptoms similar to the “black lung” disease that coal miners often contract. Currently, there is no cure for silicosis.

In other cases, the potential health consequences are much less well-known.

Martian dust carries large quantities of highly oxidizing compounds called perchlorates, which are made up of one chlorine and multiple oxygen atoms. Perchlorates are rare on Earth, but some evidence suggests that they can interfere with human thyroid function, leading to severe anemia. Even inhaling a few milligrams of perchlorates in Martian dust could be dangerous for astronauts.

Wang noted that the best time to prepare for the health risks of Martian dust is before humans ever make it to the planet. Iodine supplements, for example, would boost astronauts’ thyroid function, potentially counteracting the toll of perchlorates—although taking too much iodine can also, paradoxically, lead to thyroid disease. Filters specifically designed to screen out Martian dust could also help to keep the air in living spaces clean.

“Prevention is key. We tell everyone to go see their primary care provider to check your cholesterol before it gives you a heart attack,” Wang said. “The best thing we can do on Mars is make sure the astronauts aren’t exposed to dust in the first place.”

Co-authors of the current study include USC medical students Jeremy Rosenbaum, Ajay Prasad and Robert Raad; Esther Putnam, former graduate student in aerospace engineering sciences at CU Boulder now at SpaceX; Andrea Harrington at the NASA Johnson Space Center; and Haig Aintablian, director of the Space Medicine Program at the University of California, Los Angeles, also affiliated with SpaceX.

View Event →
Government Science Data May Soon Be Hidden. They’re Racing to Copy It.
Mar
21
2:00 PM

Government Science Data May Soon Be Hidden. They’re Racing to Copy It.

  • The Oregon Institute for Creative Research (map)
  • Google Calendar ICS

Gretchen Gehrke, an environmental scientist who helped found the Environmental Data and Governance Initiative, works at her home in Durham, N.C., March 17, 2025. Vast quantities of climate and environmental information have been removed from official websites in the past months, but scientists are trying to keep it available. (Sebastian Siadecki/The New York Times)

By Austyn Gaffney | March 21, 2025

Amid the torrent of executive orders signed by President Trump were directives that affect the language on government web pages and the public’s access to government data touching on climate change, the environment, energy and public health.

In the past two months, hundreds of terabytes of digital resources analyzing data have been taken off government websites, and more are feared to be at risk of deletion. While in many cases the underlying data still exists, the tools that make it possible for the public and researchers to use that data have been removed.

But now, hundreds of volunteers are working to collect and download as much government data as possible and to recreate the digital tools that allow the public to access that information.

Read more here

View Event →
Greenpeace ordered to pay more than $660 million over Dakota Access Pipeline protests
Mar
20
2:30 PM

Greenpeace ordered to pay more than $660 million over Dakota Access Pipeline protests

  • The Oregon Institute for Creative Research (map)
  • Google Calendar ICS

A Native American protester at a work site for the Dakota Access Pipeline near Cannon Ball, N.D., in 2016. Robyn Beck / AFP - Getty Images file

March 20, 2025, 2:53 AM PDT / Source: CNBC

By Sam Meredith, CNBC

A jury on Wednesday ordered environmental campaign group Greenpeace to pay more than $660 million in damages to Texas-based oil company Energy Transfer, the developer of the Dakota Access Pipeline.

A nine-person jury in Mandan, North Dakota, reached a verdict after roughly two days of deliberations. The verdict found Greenpeace liable for hundreds of millions of dollars over actions taken to prevent the construction of the Dakota Access Pipeline nearly a decade ago.

It marks an extraordinary legal blow for Greenpeace, which had previously warned that it could be forced into bankruptcy because of the case. The environmental advocacy group said it intends to appeal the verdict.

“This case should alarm everyone, no matter their political inclinations,” Greenpeace U.S. interim executive director Sushma Raman said in a statement published Wednesday.

“It’s part of a renewed push by corporations to weaponize our courts to silence dissent. We should all be concerned about the future of the First Amendment, and lawsuits like this aimed at destroying our rights to peaceful protest and free speech,” Raman said.

Greenpeace has described Energy Transfer’s case as a clear-cut example of SLAPPs, referring to a lawsuit designed to bury activist groups in legal fees and ultimately silence dissent. SLAPP is an acronym for “strategic lawsuit against public participation.”

Energy Transfer said the jury verdict was a “win” for “Americans who understand the difference between the right to free speech and breaking the law,” according to The Associated Press, citing a statement from the company.

“While we are pleased that Greenpeace has been held accountable for their actions against us, this win is really for the people of Mandan and throughout North Dakota who had to live through the daily harassment and disruptions caused by the protesters who were funded and trained by Greenpeace,” the company added.

A spokesperson for Energy Transfer was not immediately available to comment when contacted by CNBC on Thursday morning.

Read more here

View Event →
Whale makes epic migration, astonishing scientists
Mar
19
2:30 PM

Whale makes epic migration, astonishing scientists

  • The Oregon Institute for Creative Research (map)
  • Google Calendar ICS

Photo by Natalia Botero-Acosta | This humpback whale, photographed here off the Pacific coast of Colombia, made an epic migration

Helen Briggs | BBC environment correspondent

A humpback whale has made one of the longest and most unusual migrations ever recorded, possibly driven by climate change, scientists say.

It was seen in the Pacific Ocean off Colombia in 2017, then popped up several years later near Zanzibar in the Indian Ocean - a distance of at least 13,000 km.

The experts think this epic journey might be down to climate change depleting food stocks or perhaps an odyssey to find a mate.

Ekaterina Kalashnikova of the Tanzania Cetaceans Program said the feat was "truly impressive and unusual even for this highly migratory species".

The photograph below shows the same whale photographed in 2022, off the Zanzibar coast.

Dr Kalashnikova said it was very likely the longest distance a humpback whale had ever been recorded travelling.

Humpback whales live in all oceans around the world. They travel long distances every year and have one of the longest migrations of any mammal, swimming from tropical breeding grounds to feeding grounds in cooler waters.

But this male's journey was even more spectacular, involving two distant breeding grounds.

One theory is that climate change is altering the abundance of the tiny shrimplike krill humpback whales feed on, forcing them to travel further in search of food.

Alternatively, whales may be exploring new breeding grounds as populations rebound through global conservation efforts.

"While actual reasons are unknown, amongst the drivers there might be global changes in the climate, extreme environmental events (that are more frequent nowadays), and evolutionary mechanisms of the species," said Dr Kalashnikova.

Read more here

View Event →
We cannot ignore the climate crisis
Mar
16
10:30 AM

We cannot ignore the climate crisis

  • The Oregon Institute for Creative Research (map)
  • Google Calendar ICS

by Letters to the Editor
March 16, 2025

To the Editor:

Before his appointment as U.S. Energy Secretary, Chris Wright was the CEO of Liberty Energy, North America’s second largest fracking company. Wright recently asserted:

“I am a climate realist. The Trump administration will treat climate change for what it is, a global physical phenomenon that is a side effect of building the modern world…The only interest group that we are concerned with is the American people.”

However, stopping Earth’s warming is not consistent with the Trump Administration’s agenda of “Drill, baby, drill!” and “American energy dominance.” While it’s true that some forms of fossil fuel generation are cleaner than others, they are all increasing the concentration of heat-trapping carbon in the atmosphere.

According to the Intergovernmental Panel on Climate Change, “Limiting global mean temperature increase at any level requires global CO2 emissions to become net zero at some point in the future.”

This means reducing carbon dioxide emissions enough that they are balanced by CO2 removal, such as being absorbed by forests and dissolved in the oceans. Otherwise the concentration of carbon dioxide in the atmosphere will continue to grow.

Disturbingly, the earth has already warmed to the point that, instead of absorbing and storing carbon dioxide, the planet’s carbon sinks are becoming sources of CO2 emissions. Warmer oceans are less able to take up carbon dioxide. Moreover, permafrost is thawing, and forests are burning.

These climate feedbacks indicate that we are losing our allies in nature that are essential to the climate fight.

In the words of Johan Rockström, director of the Potsdam Institute for Climate Impact Research, “Nature has so far balanced our abuse. This is coming to an end.”

And in an interview about the catastrophic fires in California, climate scientist Peter Kalmus warned:

“It’s not a new normal. A lot of climate messaging centers around this idea that it’s a new normal. It’s a staircase to a hotter, more hellish Earth.”

A recent report by the United Nations states that, without a greater commitment to reduce emissions, the Earth will warm by 3.1° C above pre-industrial levels by 2100. And the increase in global heating is expected to continue beyond the end of the century.

A World Bank report titled Turn Down the Heat warns: “(A) global mean temperature increase of 4°C approaches the difference between temperatures today and those of the last ice age, when much of central Europe and the northern United States were covered with kilometers of ice and global mean temperatures were about 4.5°C to 7°C lower. And this magnitude of climate change—human induced—is occurring over a century, not millennia.”

This hotter climate is likely to have devastating consequences, such as the flooding of coastal cities, substantial reductions in crop yield, greatly increased water scarcity, major damage to seaports and the destruction of fisheries and coral reefs. Institutions that would normally support adaptation could collapse.

Not only does climate change threaten the foundation needed for human thriving, it disproportionately affects the world’s poorest nations and their most vulnerable citizens. A report by the United Nations Children’s Fund states: “The climate crisis is the defining human and child’s rights challenge of this generation, and is already having a devastating impact on the well-being of children globally.”

One billion children are deemed to be at “extremely high risk” from climate hazards like heat waves, drought and water scarcity. The majority live in less developed nations in Africa and South Asia that have contributed very little to this global problem. The ten countries where children are most at risk are responsible for only 0.5% of the world’s emissions.

According to the report, The Age of Consequences: The Foreign Policy and National Security Implications of Global Climate Change:

“The overwhelming message is that early steps to limit or mitigate climate change are essential, because longer-term efforts to adapt or anticipate may not be possible.”

Notably, the United States is the world’s greatest cumulative emitter, with historical emissions that are 71% more than second place China. Imagine if the Roman Empire had possessed the power to irreparably harm much of the life on earth, yet limited its concern for sustainability to just a few generations.

Furthermore, about half of the CO2 humans emit stays in the atmosphere for centuries or more. Consequently, the U.S. Fourth National Climate Assessment concludes:

“Climate change resulting from anthropogenic CO2 emissions and any associated risks to the environment, human health and society, are…essentially irreversible on human time scales.”

In his book, “A Perfect Moral Storm: The Ethical Tragedy of Climate Change,” Stephen M. Gardiner writes that, although climate change is usually discussed in scientific and economic terms, “the deepest challenge is ethical.” According to Gardiner: “What matters most is what we do to protect those vulnerable to our actions and unable to hold us accountable, especially the global poor, future generations and nonhuman nature.”

Reducing greenhouse gas emissions and funding adaptation is one of humanity’s greatest moral obligations. Even small changes in the trajectory of Earth’s warming could mean better lives for decades for many millions of people.

As the world’s most significant emitter and most powerful nation, America has a responsibility to embrace a leadership role in addressing the climate crisis.

Terry Hansen
Milwaukee, Wis.

View Event →
Red light for the greenway:  Locals oppose wildlife corridor at plutonium-contaminated Rocky Flats site
Mar
14
1:30 PM

Red light for the greenway: Locals oppose wildlife corridor at plutonium-contaminated Rocky Flats site

  • The Oregon Institute for Creative Research (map)
  • Google Calendar ICS

Rocky Flats workers inspect plutonium storage vault in building 707. They use the automated X-Y retriever to sort and retrieve plutonium metal from the storage vault for distribution to other processes in the building. Energy Department

A planned wildlife corridor would connect two Superfund sites: the former Rocky Flats plutonium plant and the Rocky Mountain Arsenal, which once produced chemical weapons. Locals fear residual contamination could spread.

By John Abbotts March 14, 2025

In September, the city council of Westminster, Colorado voted not to fund a pedestrian bridge and underpass at the Rocky Flats site due to concerns about residual soil contamination from plutonium and other hazardous materials. In the process, the city council withdrew about $200,000 in financial support for the development of the project, known as the Rocky Mountain Greenway.

The US Fish and Wildlife Service proposed the greenway to connect wildlife refuges at the Rocky Mountain Arsenal through hiking trails via the Two Ponds refuge to Rocky Flats, with plans to eventually connect to the Rocky Mountain National Park. But the plan is controversial: Both Rocky Flats and the Arsenal are still on the US Environmental Protection Agency’s National Priorities List, identified since 1987 as “Superfund” cleanup sites that contain residual contamination.

The US Army established the Arsenal to produce chemical weapons to support World War II efforts, and in the 1990s, the federal government leased part of the Arsenal to Shell Chemical Co. to manufacture fertilizer and pesticides. In 1952, the Atomic Energy Commission began operations at Rocky Flats as a federal atomic weapons facility, producing plutonium triggers for hydrogen bombs. (A hydrogen bomb or H-bomb uses fission in the primary—uranium or plutonium—to trigger the secondary into a fusion reaction that combines two atomic nuclei to form a single heavier nucleus, releasing a much larger amount of energy.) Operations started largely in secret at Rocky Flats, located in a sparsely populated area 16 miles upwind and upslope of the city of Denver. But in the late 1970s, the public became more informed about plant operations, and the movement opposing atomic weapons began to focus on the facility, organizing protests and civil disobedience actions.

By the late 1980s, when the federal cleanup program at both sites had been initiated, work had already begun on the new Denver International Airport on Rocky Mountain Arsenal lands, and the Denver suburbs had steadily spread west toward Rocky Flats. Accordingly, there was consensus at each site that expedited cleanup would most effectively protect the metropolitan area, and cleanup standards were looser than would be required for “unrestricted use,” with each site developed instead as a national wildlife refuge. The consequence was residual contamination, especially at Rocky Flats, where there was no limit on how much plutonium remained below six feet of soil in an industrial area fenced off from the public, with the surrounding land converted to a wildlife refuge. This “cleanup on the cheap” at Rocky Flats, plus a record of cover-ups of accidents at the site, created continuing distrust and controversy over post-remediation uses near Rocky Flats. Cities and citizens opposed different proposals for re-use, even over the issue of public access to the refuge. Now there are concerns that the proposed greenway—a trail between the two tracts—may facilitate cross-contamination, taking radioactive material from the Rocky Flats site to the chemically hazardous Arsenal property, and vice versa.

Read more here

View Event →
NASA Analysis Shows Unexpected Amount of Sea Level Rise in 2024
Mar
13
10:30 AM

NASA Analysis Shows Unexpected Amount of Sea Level Rise in 2024

  • The Oregon Institute for Creative Research (map)
  • Google Calendar ICS

Communities in coastal areas such as Florida, shown in this 1992 NASA image, are vulnerable to the effects of sea level rise, including high-tide flooding. A new agency-led analysis found a higher-than-expected rate of sea level rise in 2024, which was also the hottest year on record. NASA

Last year’s increase was due to an unusual amount of ocean warming, combined with meltwater from land-based ice such as glaciers.

Global sea level rose faster than expected in 2024, mostly because of ocean water expanding as it warms, or thermal expansion. According to a NASA-led analysis, last year’s rate of rise was 0.23 inches (0.59 centimeters) per year, compared to the expected rate of 0.17 inches (0.43 centimeters) per year.

“The rise we saw in 2024 was higher than we expected,” said Josh Willis, a sea level researcher at NASA’s Jet Propulsion Laboratory in Southern California. “Every year is a little bit different, but what’s clear is that the ocean continues to rise, and the rate of rise is getting faster and faster.”

In recent years, about two-thirds of sea level rise was from the addition of water from land into the ocean by melting ice sheets and glaciers. About a third came from thermal expansion of seawater. But in 2024, those contributions flipped, with two-thirds of sea level rise coming from thermal expansion.

“With 2024 as the warmest year on record, Earth’s expanding oceans are following suit, reaching their highest levels in three decades,” said Nadya Vinogradova Shiffer, head of physical oceanography programs and the Integrated Earth System Observatory at NASA Headquarters in Washington.

Since the satellite record of ocean height began in 1993, the rate of annual sea level rise has more than doubled. In total, global sea level has gone up by 4 inches (10 centimeters) since 1993.

This long-term record is made possible by an uninterrupted series of ocean-observing satellites starting with TOPEX/Poseidon in 1992. The current ocean-observing satellite in that series, Sentinel-6 Michael Freilich, launched in 2020 and is one of an identical pair of spacecraft that will carry this sea level dataset into its fourth decade. Its twin, the upcoming Sentinel-6B satellite, will continue to measure sea surface height down to a few centimeters for about 90% of the world’s oceans.

Read more here

View Event →
‘The riskometer has been going up all the time’: Tim Lenton on tipping points
Mar
12
1:30 PM

‘The riskometer has been going up all the time’: Tim Lenton on tipping points

  • The Oregon Institute for Creative Research (map)
  • Google Calendar ICS

The last house on Holland Island in the Chesapeake Bay before it collapsed in 2010 due to sea level rise. Many tipping elements in the Earth system can interact and exacerbate each other. For example, as the Greenland ice sheet melts, it raises global sea levels and could destabilize the West Antarctic ice sheet, making its collapse more likely. (Photo: baldeaglebluff, CC BY-SA 2.0)

By Jessica McKenzie | March 12, 2025

The following conversation with Tim Lenton, the founding director of the Global Systems Institute at the University of Exeter and lead author of the 2008 paper that formally introduced the idea of tipping points within the Earth’s climate system, is one of several interviews conducted for the March issue of the Bulletin of the Atomic Scientists, which is all about tipping points. You can find the other conversations here, and the rest of the magazine issue here. This interview has been lightly edited and condensed.

Jessica McKenzie: I was wondering if we could start by defining tipping points.

Tim Lenton: I define a tipping point as a situation within some system where amplifying feedback gets so strong that it can overwhelm whatever damping feedback you have in the system, creating a self-sustaining or self-propelling change within the system. I would hesitate to do this, but given that you’re the Bulletin of the Atomic Scientists, it would be pretty obvious to take the metaphor of a nuclear explosion as a case where you have runaway feedback. That’s a pretty catastrophic and extreme example, but for a scientist, it’s a classic example of an uncontrolled, runaway, amplifying feedback situation. But in general, we’re talking about situations where the amplifying feedback may not be quite so catastrophic or uncontained, but it’s strong enough to support a fundamental change or shift in a system.

McKenzie: My understanding is that you were one of the first people to start to use tipping points in the context of climate science.

Lenton: Yes, with my friend John (Hans Joachim) Schellnhuber, in the early to mid 2000s. Another colleague, Jim Hansen, was also starting to use the language of tipping points. Everybody had probably read Malcolm Gladwell’s book “The Tipping Point,” and you had popular figures using the language quite a lot, so it wasn’t surprising that it found its way into climate discourse.

What John and I did—we didn’t want the language to be used too sloppily, so we went and tried to define what it would mean to have a tipping point in a climate system, and then try to identify the bits of the climate that could be tipped, which we decided to call tipping elements, because we thought of them as elements of the climate system. That was with a bunch of colleagues, also in the UK and the Potsdam Institute in Germany, which John directed for many years. We pulled a workshop together. I think it was either autumn 2004 or 2005, and at the workshop we started an expert elicitation process. We got other expert scientists to help us in our assessment of what the tipping elements were, how they could interact, and things like this. Out of that, I led on writing the paper published in 2008 that first put tipping elements on the map. It took a long time to produce, but it was worth it. It seems to be the one that ultimately set the tone for the field.

McKenzie: Could you tell me how the phrase tipping points maybe works better than what came before, this idea of climate surprises?

Lenton: There was language around abrupt climate change that was used in the scientific circles in the late 90s, 2000s, like climate surprises. If you read the [earlier] Intergovernmental Panel on Climate Change reports, they were using very arcane language. Let me try and get this right: “Large-scale discontinuities in the climate system,” which probably doesn’t mean much to anyone other than a climate specialist.

I’ve always felt that people are readily bamboozled by complexity, or complex systems, but if you have an intuitive way into complexity, it’s okay, because we’re used to living in a world of complexity. The concept of a tipping point can be made intuitive. I often use the example of messing around as a kid, leaning back on your chair and exploring the tipping point where a small nudge can make a big difference to the fate of a system. That’s the tipping point, where you could end up flat on your back, on the floor, or back upright.

I think tipping points has worked partly because people can get the concept, and partly because it’s a faithful description of what can actually happen in systems. They can have this rare time when they’re very, very sensitive to a small nudge, and it’s going to make a big difference. It’s a nice confluence of a metaphor and a truly scientific meaning. And it’s trying to convey this concept of small change making a big difference. Change can be self-propelling. Change can be also very hard to reverse once it’s self-propelling, because you’re fighting against the system’s own momentum. And change, depending on the system, can, from a human perspective, seem quite abrupt. The timescale of change is a quality of a system itself, and some systems are fast systems, and some are slow. But nevertheless, all of that gets kind of nicely wrapped up in one thing that can be readily, easily grasped. And of course, one has to have a little bit of subtlety, like I just did. The speed of change is not always going to be as fast as falling over on your back on the chair.

McKenzie: How has the idea changed over the years?

Lenton: When we put out the first assessment, or synthesis, we were thinking we could identify a couple of systems that could be at risk, at relatively low levels of global warming—meaning one and a half to two degrees above pre-industrial—and some others, maybe another six or seven, that could be at risk at varying levels of more warming. Over time, as we gathered evidence from the real climate, from models, and all the rest of it, I expected some things to go off the map and maybe some other things to come on.

What tended to happen was, the more we learned, the more came onto the map of things at risk than went off the map. And the more we looked at it, assessments tended, on average, to bring the tipping points closer to us. The West Antarctic ice sheet is a classic example, but it’s also true for several others, like the Atlantic overturning circulation, and that happened over the last 15-odd years, as new evidence came in from the collective scientific enterprise trying to monitor these systems more carefully and model them better.

If you think of a big riskometer, the riskometer has been going up all the time, the more we learned. Some things maybe went down, but overall, it definitely went up.

McKenzie: When you’re talking about tipping elements, or tipping points within the climate systems, how do you differentiate between the temperature tipping point, versus the moment that self-amplifying feedback kicks in?

Lenton: People always want certainty if they can get it. But in a complex system like the climate, you might have what we’d call an irreducible uncertainty. So even if there is a particular tipping point for a system—firstly, it’s hard to know what that is. Even if we knew what it was, there’s a level of internal variability in the climate. It fluctuates, and it’s got a stochastic or random quality to it. So you’re never going to be able to perfectly predict when you get across the tipping point. It’s deeper than saying you can’t really predict the weather, but this does bear some relation to that.

So, how do we try and get a handle on where the thresholds are? There are several lines of attack. You look at Earth history. The interglacial period between the last two ice ages got, in some places, warmer than we are now, and that triggered the loss of a large chunk of the Greenland ice sheet, and in other interglacials, the loss of the West Antarctic Ice Sheet. So that gives us a historical anchor—some constraints—on those tipping points.

We also have models capturing what you might call the key reinforcing feedbacks of, let’s say, the loss of the Greenland ice sheet. And you run those models towards equilibrium at different levels of warming, and you see at some point, bam!—you get the tipping point, and Greenland can no longer be sustained. Of course, the subtlety is different models might give different answers to that tipping point. So then people like me try to synthesize that information and see, well, this model says here, this one says here: Here’s the range.

This is also true for our state of the art climate models. There’s quite a lot of tipping points going on in those models, even though they don’t capture everything, but they’ll often differ over what level of warming [results in tipping]. So you try to capture the range of that, and with enough information, it’s like having a distribution of estimates, of where the tipping point might be.

McKenzie: And then of course these tipping elements interact with each other, correct?

Lenton: Spot on, Jessica. I just talked as if they were acting independently. But they are also contingent. They’re causally coupled, so the probabilities, or whatever you want to think of them as, are not independent. Meaning, if A has happened, maybe the probability of B has gone up. That’s certainly true for the melting of the Greenland ice sheet, making it more likely that the Atlantic overturning circulation will be tipped. But the converse is if the Atlantic overturning circulation tipped first, and Greenland hadn’t tipped, it would make tipping Greenland harder, or less likely, because it’s really a local tipping point of temperature, if you like, for Greenland, not a global one. It’s a beautiful, complex system, as usual.

The bad news is that the direction of those interactions tends to indicate that tipping one thing makes tipping another more likely, which then leads into a discourse about whether you could have a tipping cascade, which is a bit like a chain reaction, for the nuclear physicist reading, in the worst-case scenario. That would be a really extreme case, where tipping one thing made tipping another inevitable. We don’t think that that’s often the case. Thank whoever.

McKenzie: Is that what’s been called the hot house Earth scenario or runaway global warming?

Lenton: Yes, it is, although even though I was involved in that work, I think it is more realistic to talk about a “wet house Earth” or “water world,” where we would see large-scale breakdowns of ice sheets. Weirdly, or interestingly, if you destabilize the ice sheet on, say, Greenland, sea level rise will be biggest at the other pole, at Antarctica, and raising the sea levels is one way of destabilizing an ice sheet that floats on the ocean, in the case of West Antarctica.

The prospects for some kind of bad cascade, I see more as committing to a water world or redrawing coastlines. But if you tip the Atlantic overturning circulation, we know from Earth history, you tip the monsoons in West Africa and in India off, and that would be a catastrophe. So those are the genuine cases of connected or cascading tipping that are actually much more pertinent than runaway warming.

The real question we should ask is whether we can rule runaway warming out—because it’s a really bad scenario. It’s been known for a long time, in the broadest sense, that you can create a runaway greenhouse effect, and that’s a really interesting problem to do with why Venus is uninhabitable today.

But luckily, we’re a fair way away from triggering that tipping point. That’s basically where the Earth can’t keep itself cool by shedding long wave radiation from the surface to space, because the atmosphere is just too opaque, so full of water vapor, it won’t let the heat out. Then we’re screwed, full stop.

McKenzie: It’s like a wet bulb temperature for Earth…

Lenton: If the whole atmosphere is saturated and heat can’t escape the surface as fast as it’s coming in, then you’re in runaway [warming]. That’s quite hard to do, thankfully; usually you have some places where you have healthy circulation, dry columns of descending air in the atmosphere, and you’d have heat frantically trying to escape through those windows in the atmosphere.

But let’s turn away from that, because that’s a bigger-picture concern for any living things far in the future, hopefully.

McKenzie: Climate tipping points are now widely discussed within science and mass media. What has it been like watching the term take on a life of its own? And have there been places where you’re like, “oh, that’s not quite right, I wish that wasn’t adopted in that way?”

Lenton: There’s always meaning slippage.

If I step back a bit, I was hoping we were wrong 15 to 20 years ago. We were raising a flag because we were pretty sure this was a genuine risk and it was real, but you would have wanted to be wrong in the sense that we’d overestimated the risk. It would still have been a good service to provide. I feel there was a fundamental problem, then and still now, in failing to see climate change as a risk management problem. We were really trying to shift people’s thinking.

I was told 15, 20 years ago, by many people, especially colleagues in climate science, to basically shut up. That this was gratuitous alarmism, catastrophism, and all the rest of it. But the sad thing is, reality intervened and started to show everybody that the climate is changing and faster and more convulsively than we thought. There’s little pleasure in being right when being right means we’re all facing bigger and more dangerous risks than we thought we were.

Of course, there’s still debate and discussion. Right now, I would observe, there’s a mixed discourse where inevitably there are people saying publicly, “well, this isn’t a very helpful framing of the problem.” And there are other people over-egging the pudding and maybe misusing the terminology. This is just what happens in this space of complex issues. We’re all trying to shift our way of thinking and get our brains around a complex situation. It’s deeply political, whether we like it or not. I, of course, would wish all the discourse to be as clear as it could be and as faithful to science as it can be. But I’m not an idiot. I know the world we live in, so I think the job of scientists like me is just to try to be very clear.

This is about a risk assessment approach. It’s not about a classic scientific approach, about what’s the most likely thing to happen. This is not that. This is a risk assessment. This is asking, what are some of the worst things that could happen, and what can we do to control those risks?

At least it’s on the collective radar now. Hopefully we’re getting some shift at the deepest level in people’s worldviews about the nature of the problem we’re facing. We’re 400 years into the scientific revolution of being told that the world is like a clockwork machine and outputs are proportional to inputs. But we’ve all lived through a couple of decades of experience where we realize the world is not a clockwork machine, and outputs are definitely not proportionate to inputs, and things can go very nonlinear. We’re all in a bit of mental recalibration. The dominant cultural worldview is being repeatedly challenged by reality. That’s my philosophical take on it, I suppose.

McKenzie: I understand that you’ve turned your attention to positive tipping points. Can you tell me a little bit about why you wanted to do that?

Lenton: If you’re one of the world experts on the risks, and you can see how risky the risks look, you do a certain amount of work, and you’re like: “Okay, I know enough about the problem to know how bad it is. I’m not going to gain a lot, and the world’s not going to gain a lot, if I just continue to diagnose the problem. Can I find any credible grounds of hope that there is a way for us, humans, collectively, and our technology, and society, to change fast enough in a direction that would limit these existential risks?” That’s where I got on a scientific journey, trying to see if I could find the evidence, as well as the mechanisms, for self-propelling change towards zero greenhouse gas emissions, and to stop the net destruction of nature. It took a bit of work, it took empirical examples at country scale in major sectors of the economy starting to tip for me to convince myself that this was credible.

I’m just trying to bring what skill set or toolkit I might have to help here as a public good. It’s obvious that we’re not acting fast enough on climate change. There is some action in the right direction. It just needs to go about an order of magnitude faster. And if you need to achieve a radical acceleration of action, you’ve got to look at strong amplifying feedbacks within systems and ask yourself, what can activate them and make them stronger? And that led me on a journey, because I began to see the role that social activists and social movements had played in tipping technology changes, like electric vehicles in Norway, the global solar panel revolution, or offshore wind power in Denmark.

One of my distant relatives was a famous suffragette, so I have this kind of pride in her, and in that history of a very small group of people who completely changed social norms for everybody, and a bunch of connections started to click into place for me. So I ended up writing a book about positive tipping points, which is in production.

I mean, what are scientists? Apart from being motivated by the beauty of nature and wanting to understand it, you’re kind of a professional problem solver. So you see this enormous problem, climate change, which you’re busy diagnosing and telling everybody about, but part of your instinct is to try and solve problems. I think that’s how I remain sane at the same time—not just shut up shop basically and go into denialism or despair.

McKenzie: If this is the one thing someone is reading about tipping points, what would you want them to take away from this conversation?

Lenton: It would be that we’ve all got some agency to be part of triggering positive tipping points that can accelerate us out of the existential trouble that’s otherwise going to be caused by the bad tipping points in the climate and the Earth system.

It’s a different problem than escalating nuclear war or whatever. That’s for sure. But in some ways, it’s possibly more empowering—I mean, we could all protest rightly against nuclear escalation, but here there are actually more ways in which we have agency to change the outcome. No one’s denying it’s a big and messy problem, but at the same time, think of all the other benefits. That’s the other thing that never gets stressed. Cleaner air, cleaner water, and better mental health and happier kids. There’s everything to gain.

‘Metaphors can grow legs’: David Armstrong McKay on tipping points

Inside the ice core storage area at the National Ice Core Laboratory. Ice cores from Greenland show abrupt temperature shifts in the Earth's history, information that informs tipping points research today. (Photo: Eric Cravens, Assistant Curator, National Ice Core Lab)

By Jessica McKenzie | March 12, 2025

The following conversation with David Armstrong McKay, a lecturer and researcher at the University of Sussex and the lead author of a 2022 paper that reassessed the risk of all the tipping elements in the Earth system, is one of several interviews conducted for the March issue of the Bulletin of the Atomic Scientists, which is all about tipping points. You can find the other conversations here, and the rest of the magazine issue here. This interview has been lightly edited and condensed.

Jessica McKenzie: Could you start by defining tipping points?

David Armstrong McKay: The definition that we’ve been using is when change in a system becomes self-sustaining once the system has been pushed beyond a particular threshold, and that triggers a kind of state shift, so the system will completely change state in a way which is often abrupt or irreversible, but not necessarily.

McKenzie: And where did the term come from, what’s its history?

McKay: It’s an interesting combination of ideas that sort of merged together over time. The phrase itself, ‘tipping point,’ was popularized, but not coined, by Malcolm Gladwell in his book. That was in 2000, and for a few years it was used kind of informally. It was starting to be picked up by climate scientists around 2004-2005, but it didn’t enter the scientific literature for a few years. It was only really with the 2008 paper that my colleague Tim Lenton led that it became scientifically formalized. In the years before, it was just people using it as a phrase, not necessarily citing Malcolm Gladwell’s book, but just picking up on the fact that it was becoming quite a common term. There was kind of a tipping point around the use of the phrase ‘tipping point’ with that book. The actual phrase itself goes back further to, I think, a paper in the 1950s to do with racial segregation. The term had a darker history before it was popularized by Malcolm Gladwell.

McKenzie: The ideas now associated with climate tipping points are not necessarily new; “climate surprises” became tipping points. Could you say more about the history of this idea that the climate system is unstable, and that there are different phases or states?

McKay: Some of that comes from geology. It used to be thought that pretty much everything had been formed in the flood, and then a lot of 19th and 20th century geologists said “no, things happen with gradual layers. Everything happens gradually. It’s this slow, uniform process.”

Sometime in the 1960s, there started to be more fine-grained data and evidence coming from geology and the Earth sciences more broadly, particularly from ice cores out of Greenland and sediment cores from the ocean, that started to show that there were these abrupt shifts in the past, particularly around the last glacial period, the peak of the last Ice Age. The end of the last glacial period was surprisingly fast. They were expecting, based on this uniformitarian, old-school view, that it should have been a really gradual up and down of the glacial and interglacial periods. And actually, it turns out to be quite a fast process.

McKenzie: And when you say fast—what is fast?

McKay: Geologically fast. But sometimes actually fast. If you look at some of the Greenland cores, there were some time periods where you could get 10 degrees of warming over a few decades, or something like that, which is actually really fast. But on the global scale, we’re talking more like a few hundred years to 1,000 years, which, to a geologist, is fast.

McKenzie: Ten degrees of warming over a couple decades sounds pretty fast to me! How have the ideas around tipping points changed over time?

McKay: A lot of the shift has to do with how it’s been defined and refining the definition. A lot of the stuff about climate surprises was to do with either abrupt shifts or irreversible shifts, and they’re still kind of somewhat mixed together. Even in the last IPCC report, they have this table which has abrupt shifts, irreversible shifts, and tipping points, all within the same category. Originally, particularly if you’re using the Gladwellian definition, that emphasizes the abruptness. But when you’re looking at something like an ice sheet, that can have a self-sustaining-change type tipping point. But on a human time scale, that’s not abrupt to us. It’s abrupt to the ice sheet. If the Greenland ice sheet collapses within 1,000 years, potentially, that is abrupt for an ice sheet, but it’s not abrupt for us.

Over time a large part of the community has been focusing more on this self-sustaining change dynamic, which often leads to abrupt shifts, but not necessarily; not necessarily abrupt from a human perspective, at least. So that’s the definition that we’ve been using. But it’s not universal. There are people who still prefer the abruptness-focused definition, because it taps into the imagery that most people have when they think of a tipping point. There are differences in approach there, but I think over time, mostly it’s that self-sustaining definition that people have picked up.

McKenzie: Can you talk about why the term is a bit controversial or divisive?

McKay: This goes back to when it was first being suggested and discussed, in the early 2000s. There was this worry that the phrase undersells the uncertainty around these thresholds. It suggests that it could be possible to identify with high precision a particular number, a particular warming level, that if we go beyond that, we know that tipping will definitely happen, when actually, if you look at our 2022 paper, there are these really big ranges of potential thresholds. Even for some of the better-defined ones, like the Greenland ice sheet, the range is between 0.8 and 3 degrees of warming, with collapse becoming more likely as you go up through that range. But we really don’t know exactly when it will happen.

So tipping points are sometimes implied to be far more definite than they are. You can say, well, actually, there’s a lot of uncertainty. But then people ask, if there’s that much uncertainty, is it useful?

I would argue it’s still useful, because it does have that extra sense of irreversible lock-in of some of these changes that are being made. But it does complicate the picture a lot. Sometimes people do portray it as far more definite than they should.

I think the other side of it is that people will look at that threshold and think, the only thing we should be doing is throwing absolutely everything at trying to stop it, which could encourage something like an emergency intervention, geoengineering, that sort of thing. Or if we’re close to going over it, or we go over it, then the flip side is fatalism, people think there’s nothing we can do. All we can do is adapt, and maybe we can’t even adapt, because the changes will be so big, especially if people assume that it’s a global climate tipping point, rather than tipping points in smaller parts of the climate system. Those are the main issues.

McKenzie: Maybe it’s because I’ve been reading so much about it, but it seems so obvious that it’s something within each individual system, and not a single, specific temperature.

McKay: It’s sometimes frustrating. There are not very many papers that suggest there’s one global number at which point there’s runaway global warming, but that’s how it’s often interpreted. And to be fair, there are papers that kind of suggest that, and there are people who have done public science communication who have implied that, and that doesn’t help. That kind of muddies the whole concept to a point where it’s not usable.

McKenzie: What is the connection between tipping points and the runaway greenhouse effect, or the hothouse Earth scenario? Is there any scientific credence to that scenario at all, right now?

McKay: There is a link, but it is a hypothesis. It’s still speculative. And the issue I have with it is that sometimes it is presented as kind of a definite, proven thing, or at least as if there’s lots of evidence for it, whereas it still remains a hypothesis that is being tested and doesn’t have a huge amount of evidence. The 2018 paper that proposed it suggested that there are various feedbacks and tipping points that could basically, once you got to (they speculated) two degrees of warming, cause more warming, which would trigger more warming, and so on. And that would cause us to drift towards four degrees of warming over the course of 1,000 years, or something like that. The implication was that there was some kind of emergent global tipping point coming out of some of these smaller-scale tipping points and feedbacks, and they would kind of add up to a global tipping point.

At the moment, we don’t actually have evidence supporting that. It’s something that needs to be explored and tested. But as it stands, there are other hypotheses for why, for example, the Holocene was so stable. That is part of the argument they make in that paper, saying the Holocene was stable because it was in one of these stable states, and that we can be knocked out of that stable state into a hotter one. There are alternative hypotheses for why the Holocene was stable that don’t need that stable state. And we don’t actually know if there’s a stable state at four degrees of warming that we could be knocked into. The idea needs a lot more evidence and discussion before it’s presented as a definite thing that should guide global policy.

McKenzie: If this was the only thing someone were to read about tipping points, what would you want their takeaway to be?

McKay: The thing I always want people to take away is not just to appreciate the urgency, because there is urgency with tipping points even though these things are highly uncertain. They might lock in these big future changes, big shifts, for generations. But we always need to bear in mind that it’s never too late. Even if we do pass one of these tipping points, or several tipping points, that doesn’t mean that we give up, because we can still prevent further tipping points. It doesn’t kick off this runaway warming scenario, as far as we know. Each one that is triggered will lock in some level of damage, and that’s bad. We should avoid that, but we can still prevent further ones. That’s what I would want people to come away with, so they don’t come away with this sense that there is a tipping point, and if we pass it, then all hope is lost. That’s not what we’re trying to say here.

McKenzie: It’s definitely a tricky communications line to walk.

McKay: It’s useful, but it’s very easy to misinterpret. That’s kind of the issue with it. It’s a powerful metaphor, but metaphors can grow legs and run in different directions to what you want.

‘Notoriously difficult to investigate and even more difficult to predict’: Thomas Stocker on tipping points

Tipping points at the regional level—like drastic changes to India's monsoon season—can be just as significant as global tipping points. (Photo: Rajarshi Mitra/Flickr, CC BY 2.0)

By Jessica McKenzie | March 12, 2025

The following conversation with Thomas Stocker, a professor of Climate and Environmental Physics at the University of Bern and lead author of a 2024 article that argued for the necessity of a robust assessment of tipping points by the Intergovernmental Panel on Climate Change to establish much needed consensus on the topic, is one of several interviews conducted for the March issue of the Bulletin of the Atomic Scientists, which is all about tipping points. You can find the other conversations here, and the rest of the magazine issue here. This interview has been lightly edited and condensed.

Jessica McKenzie: Could we start by defining tipping points?

Thomas Stocker: There is, of course, the mathematical definition of a tipping point, that’s quite clear. It’s dynamical systems that have more than one stable equilibrium, and where these transitions occur from one equilibrium to the next—when you change an external parameter or a forcing or whatever—that point is called a tipping point.

Usually it’s associated with hysteresis—I don’t know whether you’re familiar with that notion. It is the physical basis of magnetic information storage: Magnetic moments (the strength and orientation of a magnetic source) orient along an external magnetic field, and if that external field is switched off, the moments remain oriented due to their magnetic interaction. It requires an external magnetic field in the opposite direction to switch orientation. It turns out that hysteresis, very similar to the physical phenomenon, occurs in many non-linear dynamical systems. Hysteretic behavior has also been shown in earth system models across the entire hierarchy of model complexity, from very simple models to the comprehensive coupled general circulation models.
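The bistability and hysteresis Stocker describes can be illustrated with a toy dynamical system. The sketch below is purely hypothetical (it is not a climate model and does not appear in the interview): a state x relaxing under dx/dt = x - x^3 + f, a double-well system whose two stable branches coexist for forcings |f| below roughly 0.385. Sweeping the forcing f slowly up and then back down shows that the branch the system ends up on depends on the direction of the sweep, which is the hysteresis loop.

```python
# Toy illustration of hysteresis in a bistable system (hypothetical sketch,
# not a climate model). The state x relaxes under dx/dt = x - x**3 + f;
# its two stable equilibria coexist for |f| < 2/(3*sqrt(3)) ~ 0.385,
# and each disappears in a fold when the forcing passes that threshold.

def settle(x, f, steps=20000, dt=0.01):
    """Relax x under dx/dt = x - x**3 + f (forward Euler) to equilibrium."""
    for _ in range(steps):
        x += dt * (x - x**3 + f)
    return x

forcings = [i / 100 for i in range(-60, 61, 5)]  # f from -0.6 to 0.6

# Upward sweep: start in the lower state and carry it along as f increases.
x = settle(-1.0, forcings[0])
up = []
for f in forcings:
    x = settle(x, f)
    up.append(x)

# Downward sweep: start in the upper state and decrease f.
x = settle(1.0, forcings[-1])
down = []
for f in reversed(forcings):
    x = settle(x, f)
    down.append(x)
down.reverse()  # align down[j] with forcings[j]

# At f = 0 both equilibria exist, and the two sweeps occupy different ones:
# that path dependence is the hysteresis Stocker describes.
i = forcings.index(0.0)
print(round(up[i], 2), round(down[i], 2))  # lower branch vs upper branch
```

The upward sweep jumps to the upper branch only once f exceeds the positive fold, while the downward sweep jumps back only below the negative fold; in between, where the system sits depends on its history, just as a collapsed circulation would need the forcing pushed well past the original threshold, in the opposite direction, to recover.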

In the public discussion, ‘tipping point’ has assumed a much broader meaning. In earlier IPCC [Intergovernmental Panel on Climate Change] assessments, we called them “surprises in the climate system,” where, due to slow changes in certain parameters or in certain forcing, like the CO2 or greenhouse gas concentrations in the atmosphere, the system would react by crossing such tipping points. The classical one is the Atlantic Meridional Overturning Circulation, which really came into focus in the mid-1980s, when ice cores from Greenland showed a very dynamical behavior of the climate: A sequence of abrupt warmings and coolings in the temperature indicators measured on the Greenland ice cores. This suggested that the ocean circulation, bringing heat northward in the Atlantic, may be responsible for that. A reduction of this transport would result in a cooling, while a rapid resumption of the circulation could generate an abrupt warming.

This is still a very actively researched topic, now in the context of crossing possible tipping points due to anthropogenic heating. By now we have dedicated observational networks in the Atlantic Ocean, to monitor the behavior of this meridional overturning circulation and eventually provide information as to whether that system does show an approach towards a tipping point.

I think tipping points are equally relevant at a much smaller scale than the global-scale tipping points associated with AMOC, the West Antarctic ice sheet, or others. Consider precipitation characteristics (amount, timing, extent), or water resources in general. If a community dependent on the monsoon system, say in India, is experiencing a fundamental step change in the availability of water supplies, they would certainly interpret such a change as the crossing of a tipping point. In the future, regional tipping points and their impacts will require much more attention.

McKenzie: One thing that I wished I had gotten into this issue was something on biodiversity.

Stocker: Absolutely, the tipping point concept is not reserved for the physical sciences. In fact, equally important are all the complex systems that support habitability on this planet. The well-functioning of ecosystems is absolutely central to our survival. Consider an ecosystem with multiple participants that all have different sensitivities to local climatic conditions. Responding to the warming, the more sensitive species will migrate to colder locations (if they can), while the less sensitive stay where they are. This way you could lose key elements of an ecosystem, just as if you tore it apart. This could ultimately lead to gradual dysfunction, culminating in the ecosystem’s eventual loss.

McKenzie: The history of tipping points can be hard to tease out. I found it difficult to determine whether climate scientists were using the phrase before Malcolm Gladwell popularized the term.

Stocker: It came later. When I was a postdoc in the late 1980s, early 1990s, I was researching the North Atlantic circulation, and I was very interested in that system and its vulnerability to perturbations. One finding with a climate model of reduced complexity was that this circulation exhibited two stable states: a strong circulation and a weak or collapsed one. This is the essence of hysteresis. The driving parameter was the Atlantic freshwater balance, precipitation minus evaporation, that induced the crossing of a tipping point. Our results provided a link between very simple box models, presented in the early 1960s, and the simulations with a state-of-the-art climate model by [Syukuro] Suki Manabe, who received the Nobel Prize in Physics in 2021 and with whom we were in very lively and engaged contact.

But we never mentioned tipping points then. We said, “might the climate system have multiple equilibria, and might that actually lead to surprises in the climate system?” Already in 2001, in the third IPCC assessment, we talked about surprises in the climate system. The tipping point vocabulary came later, around 2008, with the article by Tim Lenton. That notion quickly gained traction and resonated within the community. It was a concept that, it seems, proved easier to communicate to a wider audience than complex ideas like multiple equilibria or abrupt climate change. Everybody had some kind of imagination of tipping points.

McKenzie: It’s a very evocative phrase. How has the understanding of tipping points changed since 2008, when it was introduced? Has it evolved significantly?

Stocker: Absolutely. We’ve seen a great deal of research activity worldwide on this topic, with the newer generation of coupled climate models that offer better resolution and more realism. There have been high times and low times of tipping point research, with the community vacillating between “yes, it’s important, yes, it’s prevalent,” and “maybe not—it’s a complicated system, and therefore tipping is perhaps not so prominent.” And so, the scientific community has not yet fully come to terms with it.

What I perceive now is that there is a gap opening or widening between those colleagues—respected colleagues—who say, well, “tipping is probably not the primary issue in the climate system,” and others who find all kinds of indicators that things may tip. You can see this in what the high-profile journals publish at a regular pace: One paper will argue things are on the way to tipping, and then subsequent papers say, “Ah, no, but looking at that indicator, we don’t see that the system is already tipping.”

It’s a very lively scientific debate, and it is a debate that can become confusing for the public. Therefore, a very careful assessment is required. That’s why I proposed to IPCC to address this issue head on in the seventh cycle that started last year. Switzerland promoted the proposal of a special report that would allow the scientific community to congregate under that formal umbrella with the task of finding consensus on these different possible tipping elements: the observational evidence for it, the paleoclimatic evidence for it, the evidence in model projections, and then carry out a detailed scientific assessment in the very formalized framework of a special report. Now unfortunately, last year, IPCC decided not to commission any new special reports for the seventh cycle.

McKenzie: Do they usually do one?

Stocker: The sixth cycle did three different special reports. It’s always, of course, a significant burden to the scientific community, but it’s also a unique opportunity to shed light on complex issues that the policy makers are interested in. This time around, they said, “Well, we already have one special report in the pipeline on climate change and cities, and so we have no capacity for any other special reports,” and so that was brushed off the table.

McKenzie: How long would it be before you could revisit?

Stocker: This cycle will last for another five years or so.

McKenzie: So, IPCC won’t do anything on tipping points for at least five years…

Stocker: Well, in December, they had the so-called scoping meeting, which is where policy makers and experts get together and discuss what chapters will be defined in the three respective reports, on the physical science, the impacts, and mitigation. They proposed a chapter on tipping points and irreversibility. And this has now been confirmed by the 62nd Plenary of the IPCC. This is sort of a second-best option, in which you have an entire chapter dedicated to that issue, which allows you to invite a good number of leading experts. Their assessment work will surely generate a kind of a consensus around these important questions. It may also be that in some of the aspects of that question, there will not be a consensus, and they will have to report, “we are split on this, and we cannot really say for these and these reasons, and here is the research strategy that would be needed to address that more comprehensively and more conclusively.” Although there was some pushback on the term “tipping point,” it has been retained in a somewhat weakened form in the following chapter title “Abrupt changes, low-likelihood high impact events and critical thresholds, including tipping points, in the Earth system.”

McKenzie: What are the key questions that you would hope the IPCC report would address?

Stocker: For that chapter, I would hope that they would, in a systematic way, go through the literature and address all the tipping elements proposed to date. So: AMOC, the West Antarctic ice sheet, the Amazonian rainforest, boreal permafrost, monsoon systems, etc. Then, home in on what it would mean if such a point were crossed: what impact this would have on the regional climate. That’s ultimately what we need to inform the public and the stakeholders about, regional climate, but all-encompassing, not just the mean climate but the weather aspects of the climate. In other words, the statistics of extreme events associated with a climate state or a regional climate state, and how those change if a tipping point is crossed.

At the more fundamental level, I would hope that this assessment would reach a consensus on the likelihood of tipping. You know, it could be that they say: so far, we don’t really have evidence in the 21st century for AMOC to tip, but there is stronger evidence in previous climates, and under certain future scenarios models indicate we have had, or could have, tipping. So really, a very careful and profound assessment of likelihoods and of the uncertainties associated with tipping is required.

I am aware that it’s a difficult topic, and I suspect that this was part of the reason why, in the end, [the IPCC] decided not to have a special report, apart from the decision in principle not to do special reports anymore, because it’s too much of a burden on the scientists. Tipping points are indeed a difficult topic, but on the other hand, you could say assessing climate sensitivity is difficult, and IPCC has addressed that in a very persistent and consistent way in all six previous assessments. This process allowed for the evolution of the consensus. Every assessment is a snapshot of the science. Successive assessments therefore map the evolution of climate science, and with it strengthen consensus on many issues.

McKenzie: If this is the only thing someone’s going to read about tipping points, what’s the main thing that they should take away?

Stocker: Going back to the very title of your publication, the Bulletin of the Atomic Scientists, I think it’s very important to say that what we are doing here is physics. It’s physics of the atmosphere, physics of the ocean, physics of the climate system. That is also conveyed by the title of the IPCC assessments of the first working group, which says we are assessing the physical science basis of climate change. When I give lectures on climate change, people are really surprised that climate modeling has received a Nobel Prize—not in statistics, that doesn’t exist—but in physics. Many people think these climate models are just statistical machines that make some extrapolation, and there’s really large uncertainties. We hear such statements constantly, fed and fueled by climate skeptics or climate deniers.

But look at weather forecasts. Don’t you consult the weather forecast every day? Aren’t these quite reliable? Aren’t we predicting the pathways of hurricanes and warning thousands of people in good time? That’s all physics. And of course, it’s uncertain, but it’s information that is crucial. The same holds true for future climate change, the same holds true for the tipping points. Ultimately, it is physics, and as it is with physics, instabilities are notoriously difficult to investigate and even more difficult to predict.

Climate Group Funded by Bill Gates Slashes Staff in Major Retreat

Bill Gates, Breakthrough Energy’s founder, speaking at a climate and growth summit in Paris in 2023. (Photo: Stephanie Lecocq/Reuters)

Breakthrough Energy, an umbrella organization funded by Bill Gates that works on a sprawling range of climate issues, announced deep cuts to its operations in an internal memo on Tuesday.

Dozens of staff members were cut, including Breakthrough Energy’s unit in Europe, its team in the United States working on public policy issues and most of its employees working on partnerships with other climate organizations, according to three people familiar with the matter who were not authorized to speak publicly.

The change shows how Mr. Gates is retooling his empire for the Trump era. With Republicans controlling both houses of Congress and the White House, Mr. Gates calculated that the Breakthrough policy team in the United States was not likely to have a significant effect in Washington, said the people familiar with his thinking. The U.S. policy team was also one of the largest and most expensive parts of the organization.

“Bill Gates remains as committed as ever to advancing the clean energy innovations needed to address climate change,” a spokeswoman for Mr. Gates said in a statement when asked about the cuts.


The UK fears environmental damage as ships burn after North Sea collision

Rescue crews work on site after a cargo ship was hit by a tanker carrying jet fuel for the US military off eastern England

AP

LONDON -- British officials were concerned about possible environmental damage Tuesday and looking for answers a day after a cargo ship carrying a toxic chemical hit a tanker transporting jet fuel for the U.S. military off eastern England, setting both vessels ablaze.

Jet fuel from a ruptured tank poured into the North Sea after the Portugal-registered container ship Solong broadsided the U.S.-flagged tanker MV Stena Immaculate on Monday. The collision sparked explosions and fires that burned for 24 hours. Footage filmed from a helicopter on Tuesday morning showed the fire appeared to be out on the tanker, which had a large gash on its port side.

British government minister Matthew Pennycook said it was a “fast-moving and dynamic situation.”

He said air quality readings were normal and the coast guard “are well-equipped to contain and disperse any oil spills,” with equipment including booms deployed from vessels to stop oil spreading, and aircraft that can spray dispersants on a spill.

The collision triggered a major rescue operation by lifeboats, coast guard aircraft and commercial vessels in the foggy North Sea.

All but one of the 37 crew members from the two vessels were brought ashore in the port of Grimsby, about 150 miles (240 kilometers) north of London, with one hospitalized. One crew member was missing, and the coast guard suspended the search late Monday.

U.K. marine accident investigators have begun gathering evidence of what caused the Solong, bound from Grangemouth in Scotland to Rotterdam in the Netherlands, to hit the stationary tanker, which was anchored some 10 miles (16 kilometers) off the English coast.

The investigation will be led by the U.S. and Portugal, the countries where the vessels are flagged.

The 183 meter (596 foot) Stena Immaculate was operating as part of the U.S. government’s Tanker Security Program, a group of commercial vessels that can be contracted to carry fuel for the military when needed. Its operator, U.S.-based maritime management firm Crowley, said it was carrying 220,000 barrels of Jet-A1 fuel in 16 tanks, at least one of which was ruptured.

The company said it was unclear how much fuel had leaked into the sea.

The Solong’s cargo included sodium cyanide, which can produce harmful gas when combined with water, according to industry publication Lloyd’s List Intelligence. It was unclear if there had been a leak.

Greenpeace U.K. said it was too early to assess the extent of any environmental damage from the collision, which took place near busy fishing grounds and major seabird colonies.

Environmentalists said oil and chemicals posed a risk to sea life including whales and dolphins and to birds, including puffins, gannets and guillemots that live on coastal cliffs.

Tom Webb, senior lecturer in marine ecology and conservation at the University of Sheffield, said wildlife along that stretch of coast “is of immense biological, cultural and economic importance.”

“In addition to the wealth of marine life that is present all year round, this time of the year is crucial for many migratory species,” he said.

Alex Lukyanov, who models oil spills at the University of Reading, said the environmental impact would depend on multiple factors, including “the size of the spill, weather conditions, sea currents, water waves, wind patterns and the type of oil involved.”

“This particular incident is troubling because it appears to involve persistent oil, which breaks up slowly in water,” he said. “The environmental toll could be severe.”

Read more here

Trump officials decimate climate protections and consider axeing key greenhouse gas finding
Mar
12
10:30 AM

Trump officials decimate climate protections and consider axeing key greenhouse gas finding

  • The Oregon Institute for Creative Research (map)

Emissions billow from the Phillips 66 refinery in Linden, New Jersey, on 6 February 2024. Photograph: Gary Hershorn/Getty Images.

Oliver Milman

Wed 12 Mar 2025 18.59 EDT. First published on Wed 12 Mar 2025 16.21 EDT.

Donald Trump’s administration is to reconsider the official finding that greenhouse gases are harmful to public health, a move that threatens to rip apart the foundation of the US’s climate laws, amid a stunning barrage of actions to weaken or repeal a host of pollution limits upon power plants, cars and waterways.

Trump’s Environmental Protection Agency (EPA) issued an extraordinary cavalcade of pollution rule rollbacks on Wednesday, led by the announcement it would potentially scrap a landmark 2009 finding by the US government that planet-heating gases, such as carbon dioxide, pose a threat to human health.

The so-called endangerment finding, which followed a supreme court ruling that the EPA could regulate greenhouse gases, provides the underpinning for all rules aimed at cutting the pollution that scientists have unequivocally found is worsening the climate crisis.

Despite the enormous and growing body of evidence of devastation caused by rising emissions, including trillions of dollars in economic costs, Trump has called the climate crisis a “hoax” and dismissed those concerned by its worsening impacts as “climate lunatics”.

Lee Zeldin, the EPA administrator, said the agency would reconsider the endangerment finding due to concerns that it had spawned “an agenda that throttles our industries, our mobility, and our consumer choice while benefiting adversaries overseas”.

Zeldin wrote that Wednesday was the “most consequential day of deregulation in American history” and that “we are driving a dagger through the heart of climate-change religion and ushering in America’s Golden Age.”

Zeldin boasted about the changes and said his agency’s mission was to “lower the cost of buying a car, heating a home and running a business”.

Environmentalists reacted with horror to the announcement and vowed to defend the overwhelming findings of science and the US’s ability to address the climate crisis through the courts, which regularly struck down Trump’s rollbacks in his first term. “The Trump administration’s ignorance is trumped only by its malice toward the planet,” said Jason Rylander, legal director at the Center for Biological Diversity’s Climate Law Institute.

“Come hell or high water, raging fires and deadly heatwaves, Trump and his cronies are bent on putting polluter profits ahead of people’s lives. This move won’t stand up in court. We’re going to fight it every step of the way.”

In all, the EPA issued 31 announcements within just a few hours that take aim at almost every major environmental rule designed to protect Americans’ clean air and water, as well as a livable climate.

The barrage included a move to overturn a Biden-era plan to slash pollution spewing from coal-fired power plants, which itself was a reduced version of an Obama administration initiative that was struck down by the supreme court.

The EPA will also revisit pollution standards for cars and trucks, which Zeldin said had imposed a “crushing regulatory regime” upon auto companies that are now shifting towards electric vehicles; consider weakening rules limiting sooty air pollution that is linked to an array of health problems; potentially axe requirements that power plants not befoul waterways or dump their toxic waste; and consider further narrowing how it implements the Clean Water Act in general.

The stunning broadside of actions against pollution rules could, if upheld by the courts, reshape Americans’ environment in ways not seen since major legislation was passed in the 1970s to end an era of smoggy skies and burning rivers that became the norm following American industrialization.

Pollutants from power plants, highways and industry cause a range of heart, lung and other health problems, with greenhouse gases among this pollution driving up the global temperature and fueling catastrophic heatwaves, floods, storms and other impacts.

“Zeldin’s EPA is dragging America back to the days before the Clean Air Act, when people were dying from pollution,” said Dominique Browning, director of the Moms Clean Air Force. “This is unacceptable. And shameful. We will oppose with all our hearts to protect our children from this cruel, monstrous action.”

The EPA’s moves come shortly after its decision to shutter all its offices that deal with addressing the disproportionate burden of pollution faced by poor people and minorities in the US, amid a mass firing of agency staff. Zeldin has also instructed that $20bn in grants to help address the climate crisis be halted, citing potential fraud. Democrats have questioned whether these moves are legal.

Former EPA staff have reacted with shock to the upending of the agency.

“Today marks the most disastrous day in EPA history,” said Gina McCarthy, who was EPA administrator under Obama. “Rolling these rules back is not just a disgrace, it’s a threat to all of us. The agency has fully abdicated its mission to protect Americans’ health and wellbeing.”

The Trump administration has promised additional environmental rollbacks in the coming weeks. The Energy Dominance Council that the president established last month is looking to eliminate a vast array of regulations in an effort to boost the fossil fuel industry, the interior secretary, Doug Burgum, told the oil and gas conference CERAWeek in Houston on Wednesday. “We will come up with the ways that we can cut red tape,” he said. “We can easily get rid of 20-30% of our regulations.”

Read more here

IPCC calls for the nomination of authors for the Seventh Assessment Report
Mar
11
1:00 PM

IPCC calls for the nomination of authors for the Seventh Assessment Report

  • The Oregon Institute for Creative Research (map)

GENEVA, Mar 11 – The Intergovernmental Panel on Climate Change is calling for nominations of experts to act as Coordinating Lead Authors, Lead Authors, or Review Editors for the three key Working Group contributions to IPCC’s Seventh Assessment Report (AR7). This follows the Panel’s agreement on the outlines of the three Working Group contributions during its 62nd Session held in Hangzhou, China.

Hundreds of experts around the world in different scientific domains volunteer their time and expertise to produce the reports of the IPCC. Author teams reflect a range of scientific, technical and socio-economic expertise. Coordinating Lead Authors and Lead Authors are responsible for drafting the different chapters of the Working Group contributions to the AR7 and, with the help of the Review Editors, revising those based on comments submitted during the two rounds of reviews by experts and governments.

“Our priority for the Seventh Assessment Report is to attract the most talented individuals across the whole spectrum of scientific, technical and socio-economic research. We would like to see balanced author teams involving both established experts and younger scientists new to the IPCC. It is essential that we reflect fully the breadth and depth of knowledge on climate change and climate action,” said IPCC Chair Jim Skea.

IPCC author teams include a mix of experts from different regions to ensure geographic balance. The IPCC also seeks a balance in gender, as well as between those experienced with working on IPCC reports and those new to the process, including younger scientists.

During the 60th Session of the IPCC in January 2024, the Panel agreed to continue to prepare a comprehensive assessment report and to maintain the current Working Group structure where Working Group I assesses the scientific aspects of the climate system and climate change; Working Group II looks at impacts, adaptation, and vulnerability to climate change, and Working Group III assesses the mitigation of climate change.

The outlines of the three Working Group contributions to the AR7 were developed after a comprehensive scientific scoping meeting in December 2024 in Kuala Lumpur, Malaysia before the Panel considered them and agreed upon them at the end of February.

Those interested in being nominated as a Coordinating Lead Author, a Lead Author or a Review Editor should contact their relevant Focal Point. A list of Focal Points for IPCC member governments and observer organizations is available here.

Nominations are submitted through a dedicated online nomination tool by Focal Points in governments and accredited observer organizations, as well as IPCC Bureau Members.

Governments, Observer Organisations, and IPCC Bureau Members have been requested to submit their nominations by Thursday 17 April 2025 (midnight CEST).

More information on the nomination process is available here, and an explanation of how the IPCC selects its authors is available here.

For more information, contact:
IPCC Press Office, Email: ipcc-media@wmo.int;
Andrej Mahecic, +41 22 730 8516; Werani Zabula, +41 22 730 8120.

Scientists Warn: Greenhouse Emissions Could Push Low Earth Orbit to the Brink of Collapse
Mar
11
11:00 AM

Scientists Warn: Greenhouse Emissions Could Push Low Earth Orbit to the Brink of Collapse

  • The Oregon Institute for Creative Research (map)


Lydia Amazouz
Published on March 11, 2025

The growing release of greenhouse gases into the atmosphere could pose a serious threat to the future of space operations, especially in low Earth orbit (LEO). A recent study published in Nature explores the potential consequences of increased emissions on the capacity of LEO to support satellite operations. The study highlights the risks posed by space junk, climate change, and orbital debris accumulation, which, together, could disrupt one of humanity’s most valuable technological frontiers.

The research reveals that emissions have a direct effect on the thermosphere, a layer of Earth’s atmosphere between altitudes of roughly 85 and 600 kilometers. This region plays a critical role in atmospheric drag, which can either slow satellites down, causing them to re-enter Earth’s atmosphere, or, when weaker, allow them to remain in orbit. As emissions increase, the thermosphere shrinks, leading to reduced drag on satellites and increasing the longevity of space debris. This, in turn, exacerbates the issue of overcrowding in low Earth orbit, making it harder for new satellites to operate safely.

How Greenhouse Emissions Affect Satellite Operations

The new study shows that the effects of greenhouse gas emissions could drastically reduce the space available for satellite operations in low Earth orbit by the end of the century. The researchers modeled the situation under different emissions scenarios, and the results were alarming: by 2100, under moderate to high emissions scenarios, the capacity for satellites at altitudes ranging from 400 to 1,000 kilometers could be reduced by up to 82%. This scenario could limit the number of satellites that can operate in LEO, especially during solar minimum periods.

Greenhouse gases, primarily carbon dioxide, influence the thermosphere’s density, which plays a key role in atmospheric drag. As the thermosphere becomes less dense due to the effects of greenhouse emissions, drag on satellites decreases, allowing them to remain in orbit much longer than they otherwise would. While this may seem beneficial for operational satellites, it poses significant problems for defunct ones.
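The density-drag relationship described above follows the standard drag equation, a = ½ρv²C_dA/m, so a satellite's drag deceleration, and hence its rate of orbital decay, scales linearly with thermospheric density. The sketch below illustrates that scaling; all numbers are illustrative placeholders, not values from the study, and real thermospheric density varies enormously with altitude and solar activity.

```python
def drag_acceleration(rho, v, cd=2.2, area=1.0, mass=100.0):
    """Drag deceleration a = 0.5 * rho * v^2 * Cd * A / m, in m/s^2.

    rho: atmospheric density (kg/m^3), v: orbital speed (m/s),
    cd: drag coefficient, area: cross-section (m^2), mass: kg.
    """
    return 0.5 * rho * v**2 * cd * area / mass

# Illustrative thermospheric density near 400 km (order of magnitude only).
rho_today = 3e-12   # kg/m^3
v_orbital = 7670.0  # m/s, roughly circular orbit at ~400 km

a_now = drag_acceleration(rho_today, v_orbital)
# A contracting, less dense thermosphere: density halves.
a_future = drag_acceleration(0.5 * rho_today, v_orbital)

print(a_future / a_now)  # 0.5: drag scales linearly with density
```

Because decay rate tracks drag, halving the density roughly doubles the time a defunct satellite lingers in orbit, which is the mechanism behind the debris buildup the study warns about.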

Satellites are designed to gradually lose altitude due to drag, eventually re-entering Earth’s atmosphere, where they burn up. However, as drag decreases, this natural process takes longer, leaving defunct satellites lingering in orbit and contributing to the growing debris problem. This makes the environment in low Earth orbit more hazardous, complicating the operation of new satellites and increasing the risk of collisions. As the study’s lead author, William Parker from MIT, emphasizes:

“Climate change and orbital debris accumulation are two pressing issues of inextricable global concern requiring unified action.”

The Unpredictable Future of Low Earth Orbit

The study highlights the fragility of low Earth orbit and the risks posed by increased emissions. As more satellites are launched into orbit, the problem of overcrowding becomes more serious. Currently, some 11,901 satellites are operational in orbit, alongside an additional 20,000 tracked pieces of space debris. While we are far from reaching the critical point where Kessler syndrome occurs, scientists warn that continued emissions could push us dangerously close to that threshold.

The expansion of satellite constellations, such as those deployed by companies like SpaceX, adds to the challenge of managing space debris and maintaining a safe environment in low Earth orbit. Even as technological advances improve our ability to track and monitor debris, the sheer volume of objects in orbit makes collision events increasingly likely. These collisions could result in more debris, creating an uncontrollable cycle of space junk accumulation that would threaten future space operations.

As Parker and his colleagues argue in the study:

“Understanding and respecting the influence that the natural environment has on our collective ability to operate in low Earth orbit is critical to preventing the exploitation of this regime and protecting it for future generations.”

Read more here
