Anomalo, Snowflake partner to help enterprises trust their data

Anomalo has announced a partnership with Snowflake to help customers trust the data they use to make decisions and build products. The combination provides customers with a way to monitor the quality of the data in any table in Snowflake’s platform without writing code, configuring rules, or setting thresholds.

Modern data-powered organizations are using Snowflake’s platform to centralize all of their data and make it easily available for everything from business decision-making to predictive analytics and machine learning.

However, dashboards and data-powered products are only as good as the quality of the data that powers them. Many data-powered companies quickly encounter one unfortunate fact: much of their data is missing, stale, corrupt, or prone to unexpected and unwelcome changes. As a result, companies spend more time dealing with issues in their data than unlocking its value.

Anomalo thus addresses the data quality problem by monitoring enterprise data and automatically detecting and root-causing data issues, allowing teams to resolve any hiccups with their data before making decisions, running operations, or powering models. Anomalo leverages machine learning to rapidly assess a wide range of data sets with minimal human input. If desired, enterprises can fine-tune Anomalo’s monitoring through the low-code configuration of metrics and validation rules. This is in contrast to legacy approaches to monitoring data quality that require extensive work writing data validation rules or setting limits and thresholds.
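
Anomalo’s own models are proprietary, but the general idea of learning what “normal” looks like from a table’s history, instead of hand-writing thresholds, can be illustrated with a minimal sketch. The metric (daily row counts), window length, and cutoff below are hypothetical choices for illustration, not Anomalo’s actual method:

```python
# Minimal sketch of rule-free data quality monitoring: flag days whose row count
# deviates strongly from recent history instead of using a hand-set threshold.
# This is NOT Anomalo's algorithm -- only an illustration of the general idea.
import numpy as np

def flag_anomalous_days(daily_row_counts, window=30, z_cutoff=4.0):
    """Return indices of days whose row count looks anomalous vs. the trailing window."""
    counts = np.asarray(daily_row_counts, dtype=float)
    anomalies = []
    for i in range(window, len(counts)):
        history = counts[i - window:i]          # what "normal" has looked like lately
        mu, sigma = history.mean(), history.std()
        if sigma == 0:
            continue
        if abs(counts[i] - mu) / sigma > z_cutoff:
            anomalies.append(i)
    return anomalies

# Example: 60 normal days of ~1M rows, then a day where most of the load is missing.
rng = np.random.default_rng(0)
counts = list(rng.normal(1_000_000, 20_000, 60)) + [350_000]
print(flag_anomalous_days(counts))              # -> [60]
```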

As a result, Snowflake customers can now begin monitoring the quality of their data with Anomalo in under five minutes. They simply connect Anomalo’s data quality platform to their Snowflake account and select the tables they wish to monitor. No further configuration or code is required.

Anomalo and Snowflake are used by customers globally:

  • Discover Financial Services is leveraging Anomalo to quickly gain trust in their most critical data. Discover’s Chief Data and Analytics Officer Keith Toney said: “Discover is transforming and expanding how we use data as an enterprise asset to serve our customers better through advanced data analytics. We were looking for a product that would help us maintain a scalable foundation of trusted data in a fast-paced digital environment. We selected Anomalo to fully automate the basis of our data quality monitoring because their machine learning and root cause detection technology identify late, missing, or anomalous data across our petabyte-scale cloud warehouse. Our data stewards use Anomalo’s intuitive UI to tailor monitoring to their business needs. Compared to legacy solutions, Anomalo will help us detect more quality issues with just a fraction of the time invested by our team.”
  • Faire uses Anomalo to monitor the most important tables in their Snowflake account. Daniele Perito, Chief Data Officer and co-founder at Faire, said: “We monitor hundreds of key tables in Snowflake’s platform with Anomalo. I sleep better at night knowing our data is more reliable, and my team loves how easy it is to use and how insightful the notifications are.”
  • Substack uses Anomalo to empower their small team to keep up with an ever-growing collection of data. Mike Cohen, Substack’s Data Manager, said: “With a small data team at Substack, the automated checks that Anomalo provides are like having another data engineer on the team whose primary focus is to ensure data quality and integrity. With these checks, we've caught internal data and production bugs and detected the presence of bad actors internal to our system that might have otherwise gone unnoticed for long periods.”

“Snowflake provides an ideal environment for tools like Anomalo. With its ability to centralize the full set of enterprise data and its unique ability to automatically size query workloads based on their priority and urgency, Snowflake is a perfect partner in helping enterprises trust all of their important data,” said Elliot Shmukler, co-founder and CEO of Anomalo.

“Anomalo offers an easy-to-use way to monitor every table in a customer’s Snowflake account for data quality issues," said Tarik Dwiek, Head of Technology Alliances at Snowflake. "We're excited to offer Snowflake customers the ability to leverage Anomalo to further build trust in the data they are using to develop products and make decisions.”

As part of today’s announcement, Anomalo is a Select Partner within the Snowflake Partner Program.

UK supercomputing reveals more hostile conditions on Earth as life evolved

During long portions of the past 2.4 billion years, the Earth may have been more inhospitable to life than scientists previously thought, according to new supercomputer simulations.

Graphic: how UV radiation reaching the Earth has changed over the last 2.4 billion years. Credit: Gregory Cooke

Using a state-of-the-art climate model, the researchers now believe that the level of ultraviolet (UV) radiation reaching the Earth’s surface has been underestimated, and that UV levels could have been up to ten times higher than previously assumed.

UV radiation is emitted by the sun and can damage and destroy biologically important molecules such as proteins.  

The last 2.4 billion years represent an important chapter in the development of the biosphere. Oxygen levels rose from almost zero to significant amounts in the atmosphere, fluctuating over time but eventually reaching modern-day concentrations approximately 400 million years ago.

During this time, more complex multicellular organisms and animals began to colonize the land.   

Gregory Cooke, a Ph.D. researcher at the University of Leeds who led the study, said the findings raise new questions about the evolutionary impact of UV radiation, since many forms of life are known to be harmed by intense doses of it.

He said: “We know that UV radiation can have disastrous effects if life is exposed to too much. For example, it can cause skin cancer in humans. Some organisms have effective defense mechanisms, and many can repair some of the damage UV radiation causes.  

“Whilst elevated amounts of UV radiation would not prevent life’s emergence or evolution, it could have acted as a selection pressure, with organisms better able to cope with greater amounts of UV radiation receiving an advantage.”    

The amount of UV radiation reaching the Earth is limited by the ozone in the atmosphere, described by the researchers as “...one of the most important molecules for life” because of its role in absorbing UV radiation as it passes into the Earth’s atmosphere. 

Ozone forms as a result of sunlight and chemical reactions – and its concentration is dependent on the level of oxygen in the atmosphere.  

For the last 40 years, scientists have believed that the ozone layer was able to shield life from harmful UV radiation when the level of oxygen in the atmosphere reached about one percent relative to the present atmospheric level.  
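
The study itself relies on a state-of-the-art climate model, but the underlying sensitivity (less ozone overhead means exponentially more UV at the surface) can be sketched with a simple Beer-Lambert attenuation estimate. The cross-section and column values below are rough, illustrative placeholders, not numbers from the paper:

```python
# Toy Beer-Lambert sketch: surface UV rises exponentially as the ozone column shrinks.
# Illustrative only -- the study used a full climate model, and the cross-section and
# column values here are order-of-magnitude placeholders.
import numpy as np

SIGMA_UV = 1.0e-19            # effective ozone absorption cross-section, cm^2 (assumed)
DU_TO_MOLEC_CM2 = 2.687e16    # molecules per cm^2 in one Dobson unit of ozone

def surface_uv_fraction(ozone_column_du):
    """Fraction of incoming UV that reaches the surface for a given ozone column."""
    column = ozone_column_du * DU_TO_MOLEC_CM2
    return np.exp(-SIGMA_UV * column)

for du in (300, 100, 30):     # a modern-like column vs. strongly reduced columns
    print(f"{du:>3} DU ozone -> {surface_uv_fraction(du):.2f} of incident UV transmitted")
```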

The new modeling challenges that assumption. It suggests the level of oxygen needed may have been much higher, perhaps 5% to 10% of present atmospheric levels.

Figure: a rough outline of oxygen (O2) concentrations in Earth's atmosphere through time. Brown blocks show the estimated range for O2 relative to its present atmospheric level (21% by volume). Grey-blue lines indicate important events for the evolution of life, including the emergence of eukaryotes and animals, and black arrows mark major changes in atmospheric oxygen concentration. The Archean, Proterozoic, and Phanerozoic are geological eons. GOE = Great Oxidation Event; NOE = Neoproterozoic Oxidation Event; CE = Cambrian Explosion; LE = Lomagundi Excursion. Credit: Gregory Cooke

As a result, there were periods when UV radiation levels at the Earth’s surface were much greater, and this could have been the case for most of the Earth’s history. 

Mr. Cooke said: “If our modeling is indicative of atmospheric scenarios during Earth’s oxygenated history, then for over a billion years the Earth could have been bathed in UV radiation that was much more intense than previously believed. 

“This may have had fascinating consequences for life’s evolution. It is not precisely known when animals emerged, or what conditions they encountered in the oceans or on land. However, depending on oxygen concentrations, animals and plants could have faced much harsher conditions than today’s world. We hope that the full evolutionary impact of our results can be explored in the future.”   

The results will also lead to new predictions for exoplanet atmospheres. Exoplanets are planets that orbit other stars. The presence of certain gases, including oxygen and ozone, may indicate the possibility of extra-terrestrial life, and the results of this study will aid in the scientific understanding of surface conditions on other worlds.

CfA astronomer Karen Collins uses ML to discover mysterious dusty object orbiting TIC 400799224

The Transiting Exoplanet Survey Satellite, TESS, was launched in 2018 to discover small planets around the Sun’s nearest neighbor stars. TESS has so far discovered 172 confirmed exoplanets and compiled a list of 4703 candidate exoplanets. Its sensitive camera takes images that span a huge field of view, more than twice the area of the constellation of Orion, and the mission has also assembled the TESS Input Catalog (TIC), which contains over 1 billion objects. Follow-up studies of variable TIC objects have found that their brightness variations result from stellar pulsations, shocks from supernovae, disintegrating planets, gravitationally self-lensing binary stars, eclipsing triple star systems, disk occultations, and more.

Image: an optical/near-infrared view of the sky around the TESS Input Catalog (TIC) object TIC 400799224; the crosshair marks the location of the object, and the width of the field of view is given in arcminutes. Astronomers have concluded that the mysterious periodic variations in the light from this object are caused by an orbiting body that periodically emits clouds of dust that occult the star. Credit: Powell et al., 2021

The Center for Astrophysics | Harvard & Smithsonian (CfA) astronomer Karen Collins was a member of a large team that discovered the mysterious variable object TIC 400799224. They searched the TIC using machine-learning-based computational tools developed from the observed behaviors of hundreds of thousands of known variable objects; the method has previously found, for example, disintegrating planets and dust-emitting bodies. The unusual source TIC 400799224 was spotted serendipitously because of its rapid drop in brightness, by nearly 25% in just four hours, followed by several sharp brightness variations that could each be interpreted as an eclipse.
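
The team’s actual search used machine-learning tools trained on known variables, which goes well beyond a few lines of code. As a much simpler illustration of the kind of signature involved (a roughly 25% drop within a few hours), here is a hypothetical sketch that flags steep dips in a light curve:

```python
# Simplified sketch: flag steep brightness drops in a light curve.
# This is NOT the team's ML pipeline -- just an illustration of the signature
# (a ~25% dip within a few hours) that made TIC 400799224 stand out.
import numpy as np

def find_sharp_dips(time_hours, flux, min_drop=0.2, max_window_hours=4.0):
    """Return (start, end) index pairs where flux drops by >= min_drop within the window."""
    dips = []
    for i in range(len(flux)):
        j = i + 1
        while j < len(flux) and time_hours[j] - time_hours[i] <= max_window_hours:
            if (flux[i] - flux[j]) / flux[i] >= min_drop:
                dips.append((i, j))
                break
            j += 1
    return dips

# Synthetic example: a flat light curve with one injected 25% dip lasting a few hours.
t = np.arange(0, 240, 0.5)                # ten days, one sample every 30 minutes
f = np.ones_like(t)
f[(t > 100) & (t < 104)] = 0.75           # the injected dip
print(find_sharp_dips(t, f)[0])           # first flagged (start, end) index pair
```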

The astronomers studied TIC 400799224 with a variety of facilities, including some that have been mapping the sky for longer than TESS has been operating. They found that the object is probably a binary star system and that one of the stars pulses every 19.77 days, probably because of an orbiting body that periodically emits clouds of dust that occult the star. But while the periodicity is strict, the dust occultations are erratic in their shapes, depths, and durations, and are detectable (at least from the ground) only about one-third of the time or less. The nature of the orbiting body itself is puzzling because the quantity of dust emitted is large; if it were produced by the disintegration of an object like the asteroid Ceres in our solar system, it would survive only about eight thousand years before disappearing. Yet, remarkably, over the six years that this object has been observed, the periodicity has remained strict and the object emitting the dust has remained intact. The team plans to continue monitoring the object and to incorporate historical observations of the sky to try to determine its variations over many decades.

Yale physicists build simulations that show hurricanes will roam over more of the Earth

A new, Yale-led study suggests the 21st century will see an expansion of hurricanes and typhoons into mid-latitude regions, which include major cities such as New York, Boston, Beijing, and Tokyo.

The researchers said tropical cyclones (hurricanes and typhoons) could migrate northward and southward in their respective hemispheres as the planet warms as a result of anthropogenic greenhouse gas emissions. 2020’s Subtropical Storm Alpha, the first tropical cyclone observed making landfall in Portugal, and this year’s Hurricane Henri, which made landfall in Connecticut, may be harbingers of such storms.

“This represents an important, underestimated risk of climate change,” said first author Joshua Studholme, a physicist in Yale’s Department of Earth and Planetary Sciences in the Faculty of Arts and Sciences, and a contributing author on the United Nations’ Intergovernmental Panel on Climate Change sixth assessment report published earlier this year.

“This research predicts that the 21st century’s tropical cyclones will likely occur over a wider range of latitudes than has been the case on Earth for the last 3 million years,” Studholme said.

Co-authors of the study are Alexey Fedorov, a professor of oceanic and atmospheric sciences at Yale, Sergey Gulev of the Shirshov Institute of Oceanology, Kerry Emanuel of the Massachusetts Institute of Technology, and Kevin Hodges of the University of Reading.

While an increase in tropical cyclones is commonly cited as a harbinger of climate change, much remains unclear about how sensitive they are to the planet’s average temperature. In the 1980s, study co-author Emanuel used concepts from classical thermodynamics to predict that global warming would result in more intense storms — a prediction that has been validated in the observational record.
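
For background, one commonly quoted textbook form of Emanuel’s potential-intensity relation, shown here as a standard expression rather than anything reproduced from the new study, links the maximum sustainable wind speed to the warmth of the sea surface:

$$ V_p^{2} \;=\; \frac{C_k}{C_D}\,\frac{T_s - T_o}{T_o}\,\left(k_s^{*} - k\right) $$

Here $T_s$ is the sea-surface temperature, $T_o$ the outflow temperature near the top of the storm, $C_k/C_D$ the ratio of the surface exchange coefficients for enthalpy and momentum, and $k_s^{*} - k$ the difference between the saturation enthalpy at the sea surface and the enthalpy of the overlying boundary-layer air. Warmer seas raise both the efficiency factor and the enthalpy difference, which is the thermodynamic reasoning behind the prediction of more intense storms.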

Yet other aspects of the relationship between tropical cyclones and climate still lack physically based theory. For example, there is no agreement among scientists about whether the total number of storms will increase or decrease as the climate warms, or why the planet experiences roughly 90 such events each year.

“There are large uncertainties in how tropical cyclones will change in the future,” said Fedorov. “However, multiple lines of evidence indicate that we could see more tropical cyclones in mid-latitudes, even if the total frequency of tropical cyclones does not increase, which is still actively debated. Compounded by the expected increase in average tropical cyclone intensity, this finding implies higher risks due to tropical cyclones in Earth’s warming climate.”

Typically, tropical cyclones form at low latitudes that have access to warm waters from tropical oceans and away from the shearing impact of the jet streams — the west-to-east bands of wind that circle the planet. Earth’s rotation causes clusters of thunderstorms to aggregate and spin up into the vortices that become tropical cyclones. Other mechanisms of hurricane formation also exist.

As the climate warms, temperature differences between the Equator and the poles will decrease, the researchers say. In the summer months, this may cause weakening or even a split in the jet stream, opening a window in the mid-latitudes for tropical cyclones to form and intensify.

For the study, Studholme, Fedorov, and their colleagues analyzed numerical simulations of warm climates from Earth’s distant past, recent satellite observations, and a variety of weather and climate projections, as well as the fundamental physics governing atmospheric convection and planetary-scale winds. For example, they noted that supercomputer simulations of warmer climates during the Eocene (56 to 34 million years ago) and Pliocene (5.3 to 2.6 million years ago) epochs saw tropical cyclones form and intensify at higher latitudes.

“The core problem when making future hurricane predictions is that models used for climate projections do not have sufficient resolution to simulate realistic tropical cyclones,” said Studholme, who is a postdoctoral fellow at Yale. “Instead, several different, indirect approaches are typically used. However, those methods seem to distort the underlying physics of how tropical cyclones form and develop. A number of these methods also provide predictions that contradict each other.”

The new study derives its conclusions by examining connections between hurricane physics on scales too small to be represented in current climate models and the better-simulated dynamics of Earth’s jet streams and north-south air circulation, known as the Hadley cells.

Danes minimize phase noise using ML to improve optical systems

Ultra-precise lasers can be used for optical atomic clocks, quantum computers, power cable monitoring, and much more. But all lasers make noise, which researchers from DTU Fotonik want to minimize using machine learning.

Photo: Professor Darko Zibar’s lab

The perfect laser does not exist. There will always be a bit of phase noise because the laser light frequency moves back and forth a little. Phase noise prevents the laser from producing light waves with the perfect steadiness that is otherwise a characteristic feature of the laser.

Most of the lasers we use daily do not need to be completely precise. For example, it is of no importance whether the frequency of the red laser light in the supermarket barcode scanners varies slightly when reading the barcodes. But for certain applications—for example in optical atomic clocks and optical measuring instruments—the laser must be stable so that the light frequency does not vary.

One way of getting closer to an ultra-precise laser is to determine the phase noise. Knowing the noise may make it possible to compensate for it, so that the result is a purer and more accurate laser beam.

This is precisely what Professor Darko Zibar from DTU Fotonik is working on. He heads a research group called Machine Learning in Photonic Systems, where the goal is to develop and utilize machine learning to improve optical systems. Most recently, researchers from the group have characterized the noise from a laser system from the Danish company NKT Photonics with unprecedented precision.

“The question is how to measure that noise, and here we’ve developed the most accurate method available. We can measure much more precisely than others—our method has record-high sensitivity,” says Darko Zibar.

He has developed an algorithm that uses machine learning to analyze and find patterns in the laser light, continuously improving a model of the noise. On this basis, the group of researchers hopes to be able to develop a form of intelligent filter that continuously cleans the laser beam of noise.
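
The details of the group’s machine-learning algorithm are not described here, but the basic idea of tracking a slowly wandering phase from noisy measurements, so that a noise model can be refined and the noise later subtracted, can be sketched with a simple first-order tracker. The signal model and parameters below are illustrative assumptions, not the DTU Fotonik method:

```python
# Minimal sketch of tracking a wandering laser phase from noisy measurements.
# NOT the DTU Fotonik algorithm -- just a first-order tracker showing how a noise
# model lets you follow (and in principle later subtract) slow phase drift.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
true_phase = np.cumsum(rng.normal(0.0, 0.01, n))   # random-walk phase noise (assumed model)
measured = true_phase + rng.normal(0.0, 0.2, n)    # noisy phase measurements

gain = 0.1                                         # tracking bandwidth (illustrative)
estimate = np.zeros(n)
for k in range(1, n):
    predicted = estimate[k - 1]                    # random-walk prediction step
    estimate[k] = predicted + gain * (measured[k] - predicted)   # measurement update

print(f"raw measurement error std:  {np.std(measured - true_phase):.3f} rad")
print(f"tracked estimate error std: {np.std(estimate - true_phase):.3f} rad")
```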

Quantum mechanics sets the limit

This is something that NKT Photonics can utilize in their optical measuring instruments, say Senior Researcher Poul Varming and his colleague Jens E. Pedersen, who have worked with the DTU researchers:

“We work with fiber lasers that emit constant light, and where the noise level is particularly low. Our most important task is to limit the noise, and—in terms of measuring technology—we had difficulty measuring noise at very high frequencies,” says Poul Varming, who continues:

“But then we got in touch with Darko Zibar and his group, and we produced some lasers for them. The researchers were able to measure the noise up to very high frequencies, and the results actually contradict the established understanding of laser noise.”

With the new, improved measuring method, the researchers could thus show that the existing theoretical basis for calculating the noise was incomplete. With more detailed knowledge of the noise, engineers can better identify the parts of the laser system from which the noise emanates, so they know where to make improvements. The hope is that the machine learning system can also be used to attenuate the noise in real time.

You cannot eliminate noise, because the laws of quantum mechanics set a very fundamental limit to how good a laser can be. Quantum noise is impossible to get rid of, but now it can at least be measured, says Darko Zibar:

“We can measure at the frequencies where quantum noise is dominant. In this way, we can determine the fundamental noise and find out how much it contributes to the total noise. Once we know the fundamental limit for how good the laser can be, we can then figure out how to suppress the rest of the noise.”

“This is our next project—how we first identify and then suppress the noise, to obtain a laser that is only limited by quantum noise. This will enable us to produce some of the best lasers in the world.”

Optical cable feels vibrations

When the laser noise is known, it can be combated according to roughly the same principle used in noise-reducing headphones. Here, microphones pick up sound from the surroundings, and a signal is then sent in counter phase to the speakers so that the noise and the new signal eliminate each other, and the result is silence.

If the technique can be used to improve lasers by eliminating a large part of the noise so that the light frequency virtually does not vary, optical measuring instruments can have greater sensitivity and a longer range. At NKT Photonics, the technology can initially be used for distributed acoustic sensing, where a fiber optic cable is used as a sensor for measuring tiny vibrations. Distributed acoustic sensing can be used for various forms of monitoring. For example, an optical fiber can be laid along an oil or gas pipeline to ensure ultrafast detection of any ruptures. Or the technology can be used to monitor the fence around an airport or at a border—if a hole is cut in the fence, or someone tries to climb over it—the technology can not only signal what has happened but also pinpoint where it has occurred.

Such an optical monitoring system functions by a laser beam being sent into the optical fiber. During the process, a bit of the light is reflected by tiny impurities in the fiber. However, if the fiber is affected along the way, the properties of the reflected light also change, which is measurable. Even very faint vibrations can be picked up and located with great accuracy.
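
The round-trip time of the backscattered light is what pinpoints the location. As a tiny, purely illustrative calculation (the fiber index and timing below are assumed values, not specifications of NKT Photonics’ systems):

```python
# Tiny sketch: locating a disturbance along a sensing fiber from the round-trip
# delay of the backscattered light. Values are illustrative assumptions only.
C_VACUUM = 299_792_458.0            # speed of light in vacuum, m/s
N_GROUP = 1.468                     # assumed group index of silica fiber

def disturbance_position_m(round_trip_delay_s):
    """Distance along the fiber at which the reflecting disturbance occurred."""
    v_fiber = C_VACUUM / N_GROUP    # light speed inside the fiber
    return v_fiber * round_trip_delay_s / 2.0   # halved: the light travels out and back

print(f"{disturbance_position_m(490e-6) / 1000:.1f} km")   # ~50 km for a 490 µs echo
```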

Monitoring of cables to the energy islands

If the new technology from DTU provides more effective laser light noise attenuation, distributed acoustic sensing can be used over somewhat longer distances than today. Both the sensitivity and the range of distributed acoustic sensing can be increased with the more precise lasers, and this may—for example—be needed when electricity is to be transported from the coming energy islands in the North Sea to the mainland. Here, the power cables can be monitored using the technology, so that any ruptures can be detected and repaired quickly. A challenge today is that the range of current systems is limited to a maximum of 50 km, while the distance to the energy island will be somewhat longer.

Poul Varming also mentions that several quantum technologies require extremely precise lasers. With noise-attenuated lasers, it becomes easier to develop ultra-precise optical atomic clocks and certain types of quantum computers, where lasers are used to cool individual atoms to close to absolute zero. The new generation of laser systems that may be the result of the researchers’ and engineers’ work thus offers great potential.