Glasgow sets its sights on 'cognitive' cities, where urban systems learn, predict, adapt

Imagine a city capable of sensing trouble before it occurs, anticipating traffic jams, worsening air quality, infrastructure strain, or emerging health concerns, much as a living organism responds to its surroundings. According to a new research center at the University of Glasgow, this vision is closer to reality than many realize.
 
Launched this week, the Centre for Integrated Sensing and Communication Enabling Cognitive Cities (ISAC³) is set to explore how next-generation digital technologies, particularly 6G communications, artificial intelligence, and large-scale data analytics, can transform today’s “smart cities” into cognitive ones. Unlike current urban systems, which primarily monitor conditions in real time, cognitive cities aim to predict and adapt, shifting city management from reactive response to proactive decision-making.
 
At the heart of this ambition lies data, vast volumes of it. Future 6G networks are expected to collect streams of information from advanced sensors embedded throughout urban infrastructure, as well as from next-generation mobile devices carried by residents. Processing, analyzing, and acting on this data in real time will demand not just clever algorithms, but significant computational power, placing high-performance computing and AI-driven analytics at the core of the cognitive city concept.
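As a toy illustration of what this kind of stream processing can look like at the smallest scale (this is not ISAC³'s actual software, just a minimal sketch), the snippet below flags anomalous sensor readings, such as a sudden air-quality spike, using exponentially weighted running estimates of a signal's mean and variance:

```python
class EwmaDetector:
    """Streaming anomaly detector using exponentially weighted
    estimates of the mean and variance of a sensor signal."""

    def __init__(self, alpha=0.1, threshold=4.0, warmup=5):
        self.alpha = alpha          # smoothing factor for the running stats
        self.threshold = threshold  # flag readings this many std-devs out
        self.warmup = warmup        # samples to observe before flagging
        self.count = 0
        self.mean = 0.0
        self.var = 0.0

    def update(self, x):
        """Feed one reading; return True if it looks anomalous."""
        self.count += 1
        if self.count == 1:
            self.mean = x
            return False
        dev = x - self.mean
        std = self.var ** 0.5
        if self.count > self.warmup and abs(dev) > self.threshold * std:
            return True  # flagged: leave the baseline untouched
        # fold the reading into the running mean/variance estimates
        self.mean += self.alpha * dev
        self.var = (1 - self.alpha) * (self.var + self.alpha * dev * dev)
        return False

# Simulated PM2.5-style readings with one injected spike at index 10
detector = EwmaDetector()
readings = [50, 51, 49, 50, 51, 49, 50, 51, 49, 50, 120, 50]
flags = [detector.update(r) for r in readings]
```

A citywide system would run millions of such detectors (and far more capable predictive models) concurrently, which is exactly where the high-performance computing demand arises.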
 
Professor Qammer H. Abbasi, Founding Director of ISAC³ and Professor at the University of Glasgow’s James Watt School of Engineering, describes the initiative as a response to mounting global pressures on cities. Population growth, climate change, cybersecurity risks, and the push toward decarbonization are converging challenges that traditional urban planning struggles to address in isolation.
 
“Next-generation technologies like real-time data collection, advanced communications, cyber-physical systems, and AI-driven analytics will provide the tools required to turn urban spaces into cognitive cities,” Abbasi said. “ISAC³ brings together the expertise needed to explore how these tools can work together responsibly and at scale.”

The center unites researchers across engineering, computing science, cybersecurity, public health, business, social science, and urban planning, reflecting the inherently interdisciplinary nature of future cities. Cognitive systems, the researchers suggest, will require tightly coupled sensing, communication, computation, and action, an architectural challenge that mirrors the integrated workflows increasingly seen in modern supercomputing environments.
 
One intriguing aspect of ISAC³’s vision is its focus on health and well-being. Professor Frances Mair, Head of the University of Glasgow’s School of Health & Wellbeing, notes that healthcare services have often lagged behind other sectors in adopting advanced digital tools. Cognitive city technologies, she argues, could change that dynamic by identifying early warning signs of health risks and connecting people to support before conditions worsen.
 
"In the future, ISAC technology could work quietly in the background,” Mair said, “helping communities stay healthier, safer, and more supported."
 
Beyond technical innovation, ISAC³ is also emphasizing responsible and inclusive development. Professor Nuran Acur of the Adam Smith Business School highlighted the Centre’s “quadruple helix” approach, which brings together academia, industry, public services, and society from the earliest stages of research. The goal is to ensure that technologies are not only advanced but also practical, socially responsible, and ready for real-world deployment.
 
The center's first year will focus on building a deployment roadmap through workshops and webinars with international experts in integrated sensing, communication, and computing. A key question underpinning this work is how to balance innovation with robust data protection, particularly as cities collect increasingly sensitive information from sensors and personal devices.
 
Glasgow itself will serve as a living laboratory. The University’s campus will be used as a testbed for prototype systems, developed in collaboration with industry partners including BT, Virgin Media O2, Ericsson, InterDigital, and Neutral Wireless. According to Mallik Tatipamula, CTO of Ericsson Silicon Valley, the work underway in Glasgow could have global implications.
 
“Cities that succeed will be those that can sense, interpret, and respond to their environments in real time,” Tatipamula said. “ISAC³ has the potential to help redefine how future societies function.”
 
ISAC³ presents the supercomputing community with intriguing challenges: How can vast urban data streams be processed both efficiently and securely? What part will high-performance computing and AI accelerators play in facilitating real-time, citywide predictions? And how might computational models allow policymakers to test decisions virtually before they are implemented?
 
As ISAC³ embarks on its mission, it does not claim to possess all the solutions. Rather, it is establishing a research environment driven by curiosity, exploring how cities could learn, adapt, and evolve, and examining the computational backbone necessary to make these ambitions a reality. Through this work, Glasgow is setting itself apart as a hub where the future of urban life is not just envisioned, but rigorously investigated through computation.

Supercomputers illuminate deep Earth: How giant 'blobs' shape our magnetic shield

Researchers at the University of Liverpool and their collaborators have achieved a significant advance in understanding Earth's interior. Using sophisticated numerical models powered by high-performance computing, they have demonstrated for the first time how two massive, ultra-hot formations deep within Earth's mantle affect the generation and long-term dynamics of our planet's magnetic field.
 
The Earth’s magnetic field, the invisible shield that protects life from dangerous solar and cosmic radiation, is generated by the turbulent motion of molten iron in the outer core, a process known as the geodynamo. Understanding what governs the geodynamo’s behavior over millions of years requires not only sophisticated palaeomagnetic measurements from ancient rocks but also large-scale, three-dimensional simulations that test how variations deep within the planet affect core dynamics.
 
In their study, the research team combined palaeomagnetic datasets, which record changes in the magnetic field over geological time, with supercomputer-based dynamo simulations to reveal the importance of thermal heterogeneity at the core–mantle boundary. Two continent-sized regions of intensely hot rock, located roughly 2,900 km beneath Africa and the Pacific, sit atop Earth’s outer core and create strong, lateral temperature contrasts that profoundly influence the flow of molten iron below.
 
These “blobs,” known in geophysics as Large Low-Velocity Provinces (LLVPs) because they slow down seismic waves, were already observed by seismic imaging, but their significance for magnetic field generation had been unclear until now. By incorporating the effects of thermal heterogeneity into supercomputer models of the geodynamo, researchers found that these deep mantle structures help explain key features of Earth’s ancient and modern magnetic field, including the persistence of a dominant dipolar structure and subtle longitudinal variations that earlier homogeneous models could not reproduce.
 
Running these simulations is a remarkable computational feat. The equations that govern the interaction of heat, fluid motion, magnetic induction, and rotation in Earth’s core are exceptionally complex and demand high-performance computing (HPC) clusters with massive parallel processing capability. Even with today’s most powerful machines, exploring how the magnetic field evolves over hundreds of millions of years, and how it responds to boundary conditions set by deep mantle structures, represents an immense computational challenge.
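To give a concrete sense of that complexity, most geodynamo models solve some variant of the standard rotating Boussinesq magnetohydrodynamic equations (shown here in a common textbook form, not necessarily the exact formulation used in this study), coupling the fluid velocity \(\mathbf{u}\), magnetic field \(\mathbf{B}\), and temperature \(T\):

```latex
\begin{aligned}
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  + 2\boldsymbol{\Omega}\times\mathbf{u}
  &= -\frac{1}{\rho}\nabla p + \nu\nabla^{2}\mathbf{u}
  + \alpha g T\,\hat{\mathbf{r}}
  + \frac{1}{\rho\mu_{0}}(\nabla\times\mathbf{B})\times\mathbf{B}
  && \text{(momentum, with Coriolis, buoyancy, Lorentz forces)} \\
\frac{\partial \mathbf{B}}{\partial t}
  &= \nabla\times(\mathbf{u}\times\mathbf{B}) + \eta\nabla^{2}\mathbf{B}
  && \text{(magnetic induction and diffusion)} \\
\frac{\partial T}{\partial t} + \mathbf{u}\cdot\nabla T
  &= \kappa\nabla^{2}T
  && \text{(heat transport)} \\
\nabla\cdot\mathbf{u} &= 0, \qquad \nabla\cdot\mathbf{B} = 0
  && \text{(incompressibility, no magnetic monopoles)}
\end{aligned}
```

The heterogeneous heat-flux pattern imposed as a boundary condition at the core–mantle boundary, where the LLVPs sit, is what couples this system to the deep mantle structures, and resolving the resulting turbulent, rotating, magnetized flow in three dimensions is what consumes the supercomputer time.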
 
According to Professor Andy Biggin of the University of Liverpool, “strong contrasts in the spatial pattern of core–mantle heat flux … have influenced the geodynamo for at least the last few hundred million years.” This implies that Earth’s magnetic field, while often approximated as a simple bar magnet aligned with the rotation axis, has subtle asymmetries imprinted by deep Earth processes that are only now coming into focus thanks to HPC-enabled modeling.
 
The implications of this research extend beyond geomagnetism. By demonstrating how deep mantle thermal structures influence core dynamics, these models offer a new framework for understanding the long-term evolution of the planet, including the connections between internal dynamics and surface phenomena such as continental assembly and breakup, climate shifts, and the formation of mineral resources. More accurate reconstructions of Earth’s ancient magnetic field also serve as essential constraints in palaeogeographical studies and plate-tectonic history.
 
For the supercomputing community, this breakthrough exemplifies how HPC is becoming indispensable to Earth sciences. Supercomputers are not merely accelerators of computation; they are exploratory instruments that allow scientists to build and test virtual Earths, probing regimes that cannot be accessed through direct observation or laboratory experiments. By enabling models that integrate data spanning hundreds of millions of years with high-resolution physics, supercomputers are transforming our understanding of the deep interior of the planet we call home.
 
With the ongoing growth of computational power, driven by larger HPC systems and improved algorithms, researchers can continue to enhance their models, broaden the range of conditions they examine, and incorporate the latest observational data. These advances will deepen our insights into the geodynamo, Earth’s thermal history, and the intricate connections between our planet’s interior and surface life.
 
According to the study’s authors, combining palaeomagnetic records with dynamo simulations introduces a “new means to constrain the properties and time evolution of the core–mantle boundary.” This approach provides a clearer perspective on the forces that have safeguarded life on Earth for millions of years. Thanks to supercomputing, we are now seeing a far more dynamic and interconnected Earth than previously imagined.

Supercomputers unravel the mystery of missing Tatooine-like planets

The idea of planets with twin suns, like Tatooine from Star Wars, has fascinated both scientists and the public for years. Despite binary stars being common throughout our galaxy, planets orbiting two stars (circumbinary planets) remain unexpectedly scarce. Researchers from the University of California, Berkeley, in collaboration with the American University of Beirut, now offer a compelling answer to this puzzle. Their explanation, grounded in Einstein’s general theory of relativity, emerged from sophisticated computational models powered by cutting-edge supercomputing technology.
 
Of the more than 6,000 confirmed exoplanets identified to date, only a handful orbit binary stars, a statistic that stands in stark contrast to expectations, given that stars commonly form in pairs. The team’s analysis shows that as tight binary stars spiral closer over millions of years due to tidal interactions, general relativistic precession (a subtle consequence of the warping of spacetime predicted by Einstein) changes the dynamics of the entire system in a way that destabilizes potential planets.
 
Planets in a circumbinary orbit experience gravitational tugs from both stars. Under Newtonian physics alone, this complex interplay is already difficult for planets to navigate. But when the binary’s orbit itself begins to precess, its orientation slowly rotating due to relativistic effects, the system can enter a state of secular resonance. At this point, the precession rate of the stars’ orbit matches that of the planet’s orbit, steadily driving up the eccentricity of the planet’s motion. Eventually, the planet’s path becomes highly elongated and chaotic. It can either be flung outward into interstellar space or drawn inward, where it risks destruction by one of its host stars.
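The relativistic precession driving this mechanism is easy to estimate at leading order. As an illustrative sketch (not the study's actual code), the snippet below computes the general-relativistic apsidal precession rate of a Keplerian orbit, sanity-checks it against Mercury's famous 43 arcseconds per century, and shows how much faster a tight binary precesses; resonance occurs when a planet's own apsidal rate drifts through the binary's value:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
AU = 1.496e11        # astronomical unit, m

def gr_apsidal_rate(m_total, a, e):
    """Leading-order general-relativistic apsidal precession rate
    (radians per second) for a Keplerian orbit of total mass m_total,
    semi-major axis a, and eccentricity e."""
    advance_per_orbit = 6 * math.pi * G * m_total / (C**2 * a * (1 - e**2))
    period = 2 * math.pi * math.sqrt(a**3 / (G * m_total))
    return advance_per_orbit / period

# Sanity check: Mercury's orbit should precess ~43 arcsec per century.
century = 36525 * 86400.0  # seconds
mercury = gr_apsidal_rate(M_SUN, 5.79e10, 0.2056)
mercury_arcsec = math.degrees(mercury * century) * 3600

# A tight binary (two Sun-like stars 0.05 AU apart, illustrative values)
# precesses orders of magnitude faster than Mercury's orbit does.
binary = gr_apsidal_rate(2 * M_SUN, 0.05 * AU, 0.1)
```

Mapping where that matching condition is met across millions of plausible system configurations is the part that demands large-scale computation.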
 
This resonant disruption, described in the study “Capture into Apsidal Resonance and the Decimation of Planets around Inspiraling Binaries,” was elucidated through orbit-averaged simulations that explore the dynamical evolution of circumbinary systems under a range of initial conditions. These simulations, computationally intensive and numerically sophisticated, map out the phase space of binary–planet interactions and reveal how frequently planets are captured into destructive resonances as binaries tighten. According to the study, roughly eight out of every ten potential planets in tight binary systems encounter this resonance, and three out of four are eventually destroyed or ejected, leaving behind only a few survivors on distant, hard-to-detect orbits.
 
Such high-resolution modeling is intrinsically dependent on supercomputing capabilities. Simulating the long-term evolution of three-body systems, where two stars and a planet influence one another gravitationally, requires solving coupled differential equations with precision over billions of simulated years. Conventional computing alone is insufficient for this scale of calculation; only through HPC systems can researchers explore vast ensembles of scenarios, integrate relativistic effects accurately, and uncover the nuanced mechanisms that shape planetary destinies.
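To give a flavor of the underlying computation, here is a drastically simplified sketch (the study itself used orbit-averaged secular models, and production N-body codes are far more sophisticated): a planar three-body leapfrog integrator for two stars and a circumbinary planet, with an energy check of the kind used to verify long integrations.

```python
import math

def accelerations(pos, masses):
    """Pairwise Newtonian gravitational accelerations (planar, G = 1 units)."""
    acc = [[0.0, 0.0] for _ in masses]
    for i in range(len(masses)):
        for j in range(len(masses)):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r3 = (dx * dx + dy * dy) ** 1.5
            acc[i][0] += masses[j] * dx / r3
            acc[i][1] += masses[j] * dy / r3
    return acc

def total_energy(pos, vel, masses):
    """Kinetic plus pairwise potential energy (a conserved quantity)."""
    ke = sum(0.5 * m * (v[0] ** 2 + v[1] ** 2) for m, v in zip(masses, vel))
    pe = 0.0
    for i in range(len(masses)):
        for j in range(i + 1, len(masses)):
            pe -= masses[i] * masses[j] / math.dist(pos[i], pos[j])
    return ke + pe

def leapfrog(pos, vel, masses, dt, steps):
    """Kick-drift-kick leapfrog: symplectic, so energy error stays bounded."""
    acc = accelerations(pos, masses)
    for _ in range(steps):
        for i in range(len(masses)):
            vel[i][0] += 0.5 * dt * acc[i][0]
            vel[i][1] += 0.5 * dt * acc[i][1]
            pos[i][0] += dt * vel[i][0]
            pos[i][1] += dt * vel[i][1]
        acc = accelerations(pos, masses)
        for i in range(len(masses)):
            vel[i][0] += 0.5 * dt * acc[i][0]
            vel[i][1] += 0.5 * dt * acc[i][1]
    return pos, vel

# Equal-mass circular binary (separation 1) plus a distant, nearly
# circular circumbinary planet of negligible mass at radius 5.
masses = [0.5, 0.5, 1e-6]
pos = [[-0.5, 0.0], [0.5, 0.0], [5.0, 0.0]]
vel = [[0.0, -0.5], [0.0, 0.5], [0.0, math.sqrt(1.0 / 5.0)]]

e_start = total_energy(pos, vel, masses)
pos, vel = leapfrog(pos, vel, masses, dt=0.01, steps=2000)
e_end = total_energy(pos, vel, masses)
```

Even this toy takes thousands of force evaluations for a few binary orbits; extending such calculations to billions of years, with relativistic terms and vast ensembles of initial conditions, is what pushes the problem onto HPC systems.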
 
For the supercomputing community, this research offers both inspiration and affirmation of the critical role HPC plays in astrophysics. By enabling simulations that incorporate general relativity alongside classical dynamics, supercomputers open windows into processes that cannot be observed directly, but that govern the architecture of planetary systems throughout the galaxy. They allow scientists to test theoretical ideas against virtual models of reality, refining our understanding of how planets form, persist, or perish in the cosmos.
 
The Berkeley-led team clarifies that their findings do not mean binary stars are devoid of planets. Instead, they show that while planets often form around binary stars, most are pushed into orbits that current detection tools, including NASA’s Kepler and TESS, struggle to find. A few planetary survivors may remain, hidden in distant, long-period orbits that will require innovative search techniques to uncover.
 
Looking ahead, researchers plan to apply similar modeling techniques to other astrophysical contexts, such as the environments around pairs of supermassive black holes, to understand how relativistic dynamics influence large-scale cosmic structures. In doing so, they continue to push the boundaries of computational astrophysics, using supercomputers not just as tools for calculation but as engines of discovery in the quest to understand our universe.