World’s most powerful solar telescope reaches historic milestone with first data online

On Wednesday, February 23, 2022, the U.S. National Science Foundation’s (NSF’s) Daniel K. Inouye Solar Telescope (Inouye Solar Telescope) commenced its first science observations, signaling the start of its year-long operations commissioning phase and a new era of solar science. Over 25 years in the making, the world’s most powerful solar telescope is now poised to revolutionize our understanding of the Sun and its impacts on Earth.

The Inouye Solar Telescope with an open aperture and blue skies near the Haleakalā summit, Maui, HI.

“We are proud to bring the world’s largest and most powerful solar telescope online,” said Dr. Sethuraman Panchanathan, NSF Director. “The NSF’s Daniel K. Inouye Solar Telescope is a modern technological marvel, named in honor of the late Senator Daniel K. Inouye, an American hero and a leader dedicated to scientific research and discovery.”

Hailed as the “crowning achievement” for ground-based solar astronomy, the Inouye Solar Telescope utilizes the next generation of solar observing instruments capable of capturing images of the Sun in unprecedented detail. The facility operates at 10,000 ft above sea level near the summit of Haleakalā on Maui, Hawai‘i, where unique environmental conditions enable observations of the elusive solar corona. The telescope’s operational phase is a long-awaited accomplishment, marking the end of a construction phase that began with groundbreaking in 2012 and weathered an 18-month delay caused by the COVID-19 global pandemic.

“The Inouye Solar Telescope team remained committed to developing an innovative solar telescope that pushed the frontiers of new technology. From design through construction, they overcame many challenges to realize a world-class facility poised to deliver on its transformational potential for all of humankind,” said Dr. David Boboltz, Program Director in NSF's Division of Astronomical Sciences.

The Inouye Solar Telescope will take high-resolution images and make measurements of the magnetic fields of solar phenomena including sunspots, solar flares, and coronal mass ejections. Solar activity drives space weather events that can impact Earth by disrupting power grids, communication networks, and other technology we depend on. The Inouye Solar Telescope, in concert with other advanced observatories, will provide greater insight into space weather behavior to aid in developing the means of predicting such events.

“Taking the first science observations with the Inouye Solar Telescope marks an exciting moment for the solar science community,” commented Dr. Thomas Rimmele, NSO Associate Director and lead of the Inouye Solar Telescope. “There is no other facility like the Inouye Solar Telescope. It is now the cornerstone of our mission to advance our knowledge of the Sun by providing forefront observational opportunities to the research community. It is a game-changer.”

In 2020, the Inouye Solar Telescope’s “First Light” campaign previewed the powerful optical systems by producing the highest resolution image of the solar surface ever taken, followed six months later by the clearest image ever taken of a sunspot. The facility combines numerous state-of-the-art systems including a 4-meter primary mirror, adaptive optics to correct for the effects of the atmosphere, active cooling of all telescope optics, and advanced optical and infrared instrumentation. The combination of instruments is capable of many feats, ranging from capturing images of features on the Sun three times smaller than any previously recorded to facilitating regular measurements of the solar corona’s magnetic fields for the very first time.

This landmark event has been eagerly anticipated by the global solar community. Operations begin with a 12-month commissioning phase as the telescope is gradually brought online. This is a period of “shared risk” where scientists understand that there may be issues to be solved while they exercise the many observing modes of this complex facility, its instruments, and data systems. 

The Inouye Solar Telescope’s resident scientists and operators will execute selected scientific observations, chosen via a peer-reviewed proposal process based on the experiment’s objectives, viewing conditions, and available solar features. The Visible Broadband Imager (developed by the NSO), the Visible Spectro-polarimeter (developed by the High Altitude Observatory), and the Cryogenic Near-Infrared Spectro-polarimeter (developed by the University of Hawai‘i) will be available for the initial phase of scientific proposals. A second infrared spectro-polarimeter, also developed by the University of Hawai‘i, will be made available during the second proposal call, expected to be announced early next year.

The first selected science proposal, titled “Electric Field Associated with Magnetic Reconnection Driving a Jet in the Chromosphere,” was led by Dr. Tetsu Anan, Principal Investigator with the National Solar Observatory.

The experiment is designed to verify a process known as “magnetic reconnection” by measuring electric fields that are believed to occur during this process. Magnetic reconnection is the mechanism by which solar magnetic fields are suddenly and energetically reconfigured, resulting in jets of plasma ejected from the solar atmosphere. This process has long been theorized but has yet to be proven. Observations from the Inouye Solar Telescope’s unique suite of instruments are allowing scientists to observe this elusive but vital phenomenon for the very first time.

"It is an honor to have been selected as the first science experiment executed at the Inouye Solar Telescope," said Anan. "This is a moment we've all looked forward to: a historic welcome to the new age of solar observations. I’d like to thank the co-investigators and everyone involved with the Inouye Solar Telescope for this monumental milestone.”

Dr. Jiong Qiu, Montana State University, was one of several co-investigators involved in the experiment. “Magnetic reconnection is the keyword in many energy release events in the Sun's atmosphere,” said Qiu. “For many years, solar physicists could only infer or estimate an average reconnection electric field based on many assumptions. I am hopeful that being able to directly measure this crucial physical parameter with the enabling technology of the Inouye Solar Telescope will bring breakthroughs in solar physics and revolutionize our understanding of magnetic reconnection.”

NSF’s Daniel K. Inouye Solar Telescope Data Center in Boulder, CO is an integral part of the telescope’s operations. The Data Center will receive the data from the telescope, calibrate it, curate and store it, and distribute it to scientists and the public. Once data from a proposal has been calibrated, the principal investigators and co-investigators will have exclusive access to it for six months, after which it will be made publicly available.

Missouri S&T researcher uses visible light for Wi-Fi transmission

Turning a light on and off doesn’t require much thought – click on, click off. But modulating that light – turning it on and off faster than the human eye can comprehend – and using the modulated light for Wi-Fi data transmission requires a great deal of thought, and it’s the focus of Dr. Nan Cen’s visible-light communications research.

Dr. Nan Cen’s research uses visible light to provide high-speed Wi-Fi data transfer. Photo by Michael Pierce, Missouri S&T.

“An advantage of visible-light communication is the largely unregulated spectrum ranging from 375 terahertz to 750 terahertz, which would provide higher data-rate communication than current technologies,” says Cen, an assistant professor of computer science at Missouri S&T. “Another advantage is simplicity – using basic photodetectors to receive the data from standard room lights. The technology is also inherently secure because it’s directional and can be confined within a room.”
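A quick back-of-the-envelope check puts the band Cen cites in perspective. Converting frequency to wavelength (λ = c/f) shows that 375–750 THz is the familiar visible range of roughly 400–800 nm, and that the band is about 375 THz wide; the comparison to Wi-Fi channel widths here is illustrative, not from the article:

```python
# Convert the 375-750 THz visible-light band to wavelengths via lambda = c / f.
C = 299_792_458  # speed of light, m/s

for f_thz in (375, 750):
    wavelength_nm = C / (f_thz * 1e12) * 1e9
    print(f"{f_thz} THz -> {wavelength_nm:.0f} nm")  # ~799 nm and ~400 nm

bandwidth_thz = 750 - 375
print(f"usable band: {bandwidth_thz} THz")  # vs. tens of MHz per Wi-Fi channel
```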

Cen says there currently isn’t enough spectrum to handle the growth of interconnected devices within the Internet of Things. She adds that more technology is needed to process the higher data rates available with visible light communication, but she believes that devices could be equipped with light sensors in the next 10-20 years. That would be a boon in rural areas where providing broadband access presents particular challenges.

“Many rural residents use a Wi-Fi hot spot for connectivity, but data speed is slow,” says Cen. “Telephone lines also provide slow data rates, and fiber-optic installation is expensive.”

Cen is researching the possibility of using lasers between power poles in rural areas, with solar panels acting as the receiver for homes. Instead of a router, standard household lamps would transmit data. It is fascinating work, but Cen is one of only a few U.S. researchers working on visible light communication.

“Most U.S. researchers are focused on the terahertz band,” she says. “Current terahertz-band communication is radio-frequency based and visible-light communication is not, although there are some common characteristics.”

Cen says a drawback of visible light communication is its short propagation range. Using lasers, the range could be several kilometers, but with regular household light, the range is a few meters. Visible light is also easy to block. Cen says researchers may use novel algorithms to enhance the propagation range and propose new technologies to overcome blockage.

Applications could range from transportation to virtual reality to medical settings where electromagnetic interference limits contact using radiofrequency. Eventually, drones could provide visible light communication to remote or extreme environments where infrastructure doesn’t exist. But Cen says the U.S. needs more researchers in the area plus industry interest to produce compatible devices.

“Visible light communication has a broad range of applications,” she says. “We need to demonstrate that this is feasible technology that we definitely need.”

NC State researchers use AI to predict flood damage risk

In a new study, North Carolina State University researchers used artificial intelligence to predict where flood damage is likely to happen in the continental United States, suggesting that recent flood maps from the Federal Emergency Management Agency do not capture the full extent of flood risk.

Map of the United States showing predicted average flood damage risk by state or district. Credit: Elyssa Collins.

In the study, published in Environmental Research Letters, researchers found a high probability of flood damage – including monetary damage, human injury and loss of life – for more than a million square miles of land across the United States over a 14-year period. That area was more than 790,000 square miles greater than the flood risk zones identified by FEMA’s maps.

“We’re seeing that there’s a lot of flood damage being reported outside of the 100-year floodplain,” said the study’s lead author Elyssa Collins, a doctoral candidate in the NC State Center for Geospatial Analytics. “There are a lot of places that are susceptible to flooding, and because they’re outside the floodplain, that means they do not have to abide by insurance, building code and land-use requirements that could help protect people and property.”

It can cost FEMA as much as $11.8 billion to create national Flood Insurance Rate Maps, which show whether an area has at least a 1% chance of flooding in a year, according to a 2020 report from the Association of State Floodplain Managers. Researchers say their method of using machine learning tools to estimate flood risk offers a way of rapidly updating flood maps as conditions change or more information becomes available.
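The “1% chance of flooding in a year” definition understates cumulative exposure, because the risk compounds year over year. A short calculation makes the point:

```python
# A "100-year floodplain" has a 1% chance of flooding in any given year.
# The probability of at least one flood in n years is 1 - (1 - 0.01)**n.
p_annual = 0.01
for years in (1, 10, 30, 100):
    p_at_least_one = 1 - (1 - p_annual) ** years
    print(f"{years:3d} years: {p_at_least_one:.1%} chance of at least one flood")
```

Over a 30-year mortgage, for example, a home in the 100-year floodplain faces roughly a one-in-four chance of experiencing at least one flood.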

“This is the first spatially complete map of flood damage probability for the United States; wall-to-wall information that can be used to learn more about flood risk in vulnerable, underrepresented communities,” said Ross Meentemeyer, Goodnight Distinguished Professor of Geospatial Analytics at NC State.

To create their computer models, researchers used reported flood damage data for the United States, along with other information such as whether land is close to a river or stream, type of land cover, soil type and precipitation. The models “learned” from actual reports of damage to predict the likelihood of flood damage for each pixel of mapped land. The researchers created separate models for each watershed in the United States.

“Our models are not based in physics or the mechanics of how water flows; we’re using machine learning methods to create predictions,” Collins said. “We developed models that relate predictors – variables related to flood damage such as extreme precipitation, topography, the relation of your home to a river – to a data set of flood damage reports from the National Oceanic and Atmospheric Administration. It’s very fast – our models for the U.S. watersheds ran in an average of five hours.”
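The recipe Collins describes – relate predictors to a data set of binary damage reports – can be sketched in toy form. This is not the study’s pipeline: the predictor names are taken from the article, but the data are synthetic, and a plain logistic regression stands in for the study’s machine-learning models:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Synthetic predictors for one hypothetical watershed.
X = np.column_stack([
    rng.uniform(0, 5, n),      # distance to nearest stream (km)
    rng.uniform(0, 300, n),    # elevation (m)
    rng.uniform(20, 120, n),   # annual extreme precipitation (mm)
])

# Synthetic labels: damage more likely near streams, at low elevation,
# and with heavy precipitation (coefficients are made up).
logit = -1.5 * X[:, 0] - 0.01 * X[:, 1] + 0.05 * X[:, 2]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(float)

# Standardize features, then fit logistic regression by gradient descent.
Xs = (X - X.mean(0)) / X.std(0)
w, b = np.zeros(3), 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(Xs @ w + b)))
    w -= 0.1 * (Xs.T @ (p - y)) / n
    b -= 0.1 * (p - y).mean()

# Fitted signs should recover the synthetic story: distance to a stream
# lowers damage probability, extreme precipitation raises it.
print(w)
```

Fitting one such model per watershed, then evaluating it per pixel, yields a damage-probability surface of the kind the study maps.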

The actual flood damage reports they used to “train” the models were publicly available reports from NOAA made between December 2006 and May 2020. Compared with recent FEMA maps downloaded in 2020, 84.5% of the damage reports they evaluated were not within the agency’s high-risk flood areas. Most of these, 68.3%, were in mapped areas outside the high-risk floodplain, while 16.2% were in locations FEMA had not mapped at all.

When they ran their computer models to determine flood damage risk, they found a high probability of flood damage for more than 1.01 million square miles across the United States, while the mapped area in FEMA’s 100-year floodplain is about 221,000 square miles. Researchers said there are factors that could help explain why the differences were so large, including that their machine-learning-based model assessed damage from floods of any frequency, while FEMA only includes flooding that would occur from storms that have a 1% chance of happening in any given year.

“Potentially, FEMA is underestimating flood damage exposure,” Collins said.

One of the biggest drivers of flood damage risk was proximity to a stream, along with elevation and the average amount of extreme precipitation per year. The three Census regions with the highest probability were in the Southeast. Louisiana, Missouri, the District of Columbia, Florida and Mississippi had the highest risk of any state or district in the continental United States. Of the 30 highest-risk counties, three were in North Carolina: Dare, Hyde and Tyrrell.

In their model, researchers used historical climate data. In the future, they plan to account for climate change.

In the meantime, researchers say their findings, which will be publicly accessible, could be useful for helping policymakers involved in land-use planning. They also represent a proof-of-concept method for efficiently updating flood maps in the future.

“There is still work to be done to make this model more dynamic,” Collins said. “But it’s part of a shift in thinking about how we approach these problems in a more cost-effective and computationally efficient manner. Inevitably, with climate change, we’re going to have to update these maps and models as events occur. It would be helpful to have future estimates that we can use to prepare for whatever is to come.”

UK researchers offer radical rethink of how to improve AI in the future

Computer scientists at the University of Essex have devised a radically different approach to improving artificial intelligence (AI) in the future.

The Essex team hopes the research, published in the Journal of Machine Learning Research, will provide a backbone for the next generation of AI and machine learning breakthroughs.

This could translate into improvements in everything from driverless cars and smartphones that better understand voice commands to enhanced automatic medical diagnoses and drug discovery.

“Artificial intelligence research ultimately has the goal of producing completely autonomous and intelligent machines which we can converse with and will do tasks for us, and this newly published work accelerates our progress towards that,” explained co-lead author Dr. Michael Fairbank, from Essex’s School of Computer Science and Electronic Engineering.

The recent impressive breakthroughs in AI around vision tasks, speech recognition, and language translation have involved "deep learning", which means training multi-layered artificial neural networks to solve a task. However, training these deep neural networks is computationally expensive, requiring huge numbers of training examples and large amounts of computing time.

What the Essex team, which includes Professor Luca Citi and Dr. Spyros Samothrakis, has achieved is to devise a radically different approach to training neural networks in deep learning.

“Our new method, which we call Target Space, provides researchers with a step-change in the way they can improve and build their AI creations,” added Dr. Fairbank. “Target Space is a paradigm-changing view, which turns the training process of these deep neural networks on its head, ultimately enabling the current progress in AI developments to happen faster.”

The standard way to train a neural network is to repeatedly make tiny adjustments to the connection strengths between the neurons in the network. The Essex team has taken a different approach: instead of tweaking the connection strengths between neurons, the new “target-space” method tweaks the firing strengths of the neurons themselves.
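The core reparameterization behind this idea can be sketched in a toy form: treat desired firing values (“targets”) of a layer on a reference batch as the trainable parameters, and recover the layer’s weights from them by least squares. This is an illustrative simplification written for this article, not the authors’ published algorithm, which involves considerably more machinery (such as the cascade untangling Professor Citi describes below):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 8))   # reference batch: 32 samples, 8 inputs
T = rng.normal(size=(32, 4))   # trainable "targets": desired pre-activations
                               # of a 4-neuron layer on the reference batch

# Instead of storing weights directly, derive the weights that best realize
# the targets on the reference batch (a least-squares projection).
W, *_ = np.linalg.lstsq(X, T, rcond=None)

# The layer then fires as usual with the derived weights; a target-space
# optimizer would adjust T (re-deriving W each step) rather than adjust W.
A = np.tanh(X @ W)
print(W.shape, A.shape)
```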

Professor Citi added: “This new method stabilizes the learning process considerably, by a process which we call cascade untangling. This allows the neural networks being trained to be deeper, and therefore more capable, and at the same time potentially requiring fewer training examples and less computing resources. We hope this work will provide a backbone for the next generation of artificial intelligence and machine-learning breakthroughs.”

The next steps for the research are to apply the method to various new academic and industrial applications.

Karolinska Institutet shows why natural killer cells react to COVID-19

Little has been known to date about how the immune system’s natural killer (NK) cells detect which cells have been infected with SARS-CoV-2. An international team of scientists led by researchers from Karolinska Institutet, ranked amongst the world's best medical schools, in Sweden now shows that NK cells respond to a certain peptide on the surface of infected cells. The study, which is published in Cell Reports, is an important piece of the puzzle in our understanding of how the immune system reacts to COVID-19.

NK cells are white blood cells that are part of the innate immune system. Unlike cells of the adaptive immune system, they can recognize and kill cancer cells and virus-infected cells immediately, without having encountered them before. This ability is controlled by a balance between the NK cells’ activating and inhibiting receptors, which can react to different molecules on the surface of other cells.

NK cells are part of the innate immune system. Image: NIAID

The virus is revealed by a peptide

A new study shows why certain NK cells are activated when encountering a cell infected with SARS-CoV-2. The infected cells contain a peptide from the virus that triggers a reaction in NK cells that carry a particular receptor, NKG2A, able to detect the peptide.

“Our study shows that SARS-CoV-2 contains a peptide that is displayed by molecules on the cell surface,” says Quirin Hammer, a researcher at the Center for Infectious Medicine (CIM), Karolinska Institutet. “The activation of NK cells is a complex reaction, and here the peptide blocks the inhibition of the NK cells, which allows them to be activated. This new knowledge is an important piece of the puzzle in our understanding of how our immune system reacts in the presence of this viral infection.”

The study was a major collaboration between Karolinska Institutet, Karolinska University Hospital, and research laboratories and universities in Italy, Germany, Norway, and the USA. In the first phase, the hypothesis was tested using supercomputer simulations, whose results were then confirmed in the laboratory. The decisive phase was the infection of human lung cells with SARS-CoV-2 in a controlled environment, whereupon the researchers could show that NK cells with the receptor in question are activated to a greater degree than NK cells without it.

Monitoring new virus variants

“These findings are important to our understanding of how immune cells recognize cells infected with SARS-CoV-2,” says Dr. Hammer. “This may become significant when monitoring new virus variants to determine how well the immune system responds to them.”

The study is now being followed up with the help of a biobank at Karolinska University Hospital and Karolinska Institutet containing blood samples from over 300 people treated for COVID-19 during the first wave of the pandemic.

“We’ll be examining if the composition of NK cells a person has contributes to how severe their symptoms are when infected with SARS-CoV-2,” he continues.