Robert Klie, professor of physics. Photo: Jenny Fontaine

Researchers at the University of Illinois at Chicago describe a new technique for precisely measuring the temperature and behavior of new two-dimensional materials that will allow engineers to design smaller and faster microprocessors. Their findings are reported in the journal Physical Review Letters.

Newly developed two-dimensional materials, such as graphene -- which consists of a single layer of carbon atoms -- have the potential to replace traditional microprocessing chips based on silicon, which have reached the limit of how small they can get. But engineers have been stymied by the inability to measure how temperature will affect these new materials, collectively known as transition metal dichalcogenides, or TMDs.

Using scanning transmission electron microscopy combined with spectroscopy, researchers at UIC were able to measure the temperature of several two-dimensional materials at the atomic level, paving the way for much smaller and faster microprocessors. They were also able to use their technique to measure how the two-dimensional materials would expand when heated.

"Microprocessing chips in computers and other electronics get very hot, and we need to be able to measure not only how hot they can get, but how much the material will expand when heated," said Robert Klie, professor of physics at UIC and corresponding author of the paper. "Knowing how a material will expand is important because if a material expands too much, connections with other materials, such as metal wires, can break and the chip is useless."

Traditional ways to measure temperature don't work on tiny flakes of two-dimensional materials that would be used in microprocessors because they are just too small. Optical temperature measurements, which use a reflected laser light to measure temperature, can't be used on TMD chips because they don't have enough surface area to accommodate the laser beam.

"We need to understand how heat builds up and how it is transmitted at the interface between two materials in order to build efficient microprocessors that work," said Klie.

Klie and his colleagues devised a way to take temperature measurements of TMDs at the atomic level using scanning transmission electron microscopy, which uses a beam of electrons transmitted through a specimen to form an image.

"Using this technique, we can zero in on and measure the vibration of atoms and electrons, which is essentially the temperature of a single atom in a two-dimensional material," said Klie. Temperature is a measure of the average kinetic energy of the random motions of the particles, or atoms, that make up a material. As a material gets hotter, its atoms vibrate more energetically. At absolute zero, the lowest theoretical temperature, all atomic motion stops.
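The kinetic-theory relation described here can be sketched numerically. This is a generic illustration of the classical <E_k> = (3/2) k_B T relation, not the paper's measurement method; the atom mass and temperature below are arbitrary examples.

```python
# A generic illustration of the kinetic-theory relation <E_k> = (3/2) k_B T,
# not the paper's measurement method. It estimates the RMS speed of an atom
# at a given temperature; the mass and temperature are arbitrary examples.
import math

K_B = 1.380649e-23    # Boltzmann constant, J/K
AMU = 1.66053907e-27  # atomic mass unit, kg

def rms_speed(temp_kelvin, mass_amu):
    """RMS speed from (1/2) m <v^2> = (3/2) k_B T."""
    mass_kg = mass_amu * AMU
    return math.sqrt(3 * K_B * temp_kelvin / mass_kg)

# e.g. a carbon atom (12 amu) at room temperature (300 K)
print(f"{rms_speed(300.0, 12.0):.0f} m/s")  # roughly 790 m/s
```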

Klie and his colleagues heated microscopic "flakes" of various TMDs inside the chamber of a scanning transmission electron microscope to different temperatures and then aimed the microscope's electron beam at the material. Using a technique called electron energy-loss spectroscopy, they were able to measure the scattering of electrons off the two-dimensional materials caused by the electron beam. The scattering patterns were entered into a computer model that translated them into measurements of the vibrations of the atoms in the material - in other words, the temperature of the material at the atomic level.
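The idea of converting scattering measurements into a temperature can be illustrated with the detailed-balance relation commonly used in vibrational spectroscopy, where energy-gain events are suppressed relative to energy-loss events by a Boltzmann factor. This is a simplified sketch, not the authors' analysis code, and the phonon energy and intensity ratio below are hypothetical.

```python
# A simplified sketch of the detailed-balance relation behind vibrational
# (phonon) EELS thermometry: electrons gain energy from a lattice vibration
# less often than they lose energy to one, by a Boltzmann factor exp(-E/kT),
# so the loss/gain intensity ratio encodes a local temperature.
# All numbers are hypothetical; this is not the authors' analysis code.
import math

K_B_EV = 8.617333e-5  # Boltzmann constant, eV/K

def temperature_from_eels(phonon_energy_ev, i_loss, i_gain):
    """Temperature implied by the energy-loss/energy-gain intensity ratio."""
    return phonon_energy_ev / (K_B_EV * math.log(i_loss / i_gain))

# Hypothetical: a 0.05 eV vibrational mode with a loss/gain ratio of 5
print(f"{temperature_from_eels(0.05, 5.0, 1.0):.0f} K")  # about 360 K
```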

"With this new technique, we can measure the temperature of a material with a resolution that is nearly 10 times better than conventional methods," said Klie. "With this new approach, we can design better electronic devices that will be less prone to overheating and consume less power."

The technique can also be used to predict how much materials will expand when heated and contract when cooled, which will help engineers build chips that are less prone to breaking at points where one material touches another, such as when a two-dimensional material chip makes contact with a wire.

"No other method can measure this effect at the spatial resolution we report," said Klie. "This will allow engineers to design devices that can manage temperature changes between two different materials at the nano-scale level."

CAPTION Nodes represent health-related variables; lines represent the strength of interdependence between two variables. MENet helps build the Optimal Information Network (OIN), which indicates the most useful information to accurately characterize systemic health. CREDIT Servadio J.L. and Convertino M., Science Advances, Feb. 2, 2018.

A new method could help researchers develop unbiased indicators for assessing complex systems such as population health.

Researchers have developed a complex system model to evaluate the health of populations in some U.S. cities based only on the most significant variables expressed in available data. Their unbiased network-based probabilistic approach to mine big data could be used to assess other complex systems, such as ranking universities or evaluating ocean sustainability.

Societies today are data-rich, which can both empower and overwhelm. Sifting through this data to determine which variables to use for the assessment of something like the health of a city's population is challenging. Researchers often choose these variables based on their personal experience. They might decide that adult obesity rates, mortality rates, and life expectancy are important variables for calculating a generalized metric of the residents' overall health. But are these the best variables to use? Are there other more important ones to consider?

Matteo Convertino of Hokkaido University in Japan and Joseph Servadio of the University of Minnesota in the U.S. have introduced a novel probabilistic method that allows the visualization of the relationships between variables in big data for complex systems. The approach is based on "maximum transfer entropy," which probabilistically measures the strength of relationships between multiple variables over time.
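For readers unfamiliar with the quantity, a minimal plug-in estimator of lag-1, discrete transfer entropy can be sketched as follows. The paper's maximum-transfer-entropy estimation is considerably more sophisticated, and the toy series below are made up.

```python
# A minimal plug-in estimator of (lag-1, discrete) transfer entropy,
#   TE(X -> Y) = sum p(y', y, x) * log2[ p(y'|y, x) / p(y'|y) ],
# the quantity that "maximum transfer entropy" networks build on. The paper's
# estimation is far more sophisticated; the toy series below are made up.
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """TE from X to Y, lag 1, over discrete symbol sequences."""
    n = len(y) - 1
    triples  = Counter(zip(y[1:], y[:-1], x[:-1]))  # (y_next, y, x)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y, x)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_next, y)
    singles  = Counter(y[:-1])                      # y
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint     = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]            # p(y'|y, x)
        p_cond_self = pairs_yy[(y1, y0)] / singles[y0]  # p(y'|y)
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te

# Toy data: Y copies X with a one-step delay, so X should "drive" Y.
x = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0]
y = [0] + x[:-1]
print(f"TE(X->Y) = {transfer_entropy(x, y):.3f} bits")
```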

Using this method, Convertino and Servadio mined through a large amount of health data in the U.S. to build a "maximum entropy network" (MENet): a model composed of nodes representing health-related variables, and lines connecting the variables. The lines are darker the stronger the interdependence between two variables. This allowed the researchers to build an "Optimal Information Network" (OIN) by choosing the variables that had the most practical relevance for assessing the health status of populations in 26 U.S. cities from 2011 to 2014. By combining the data from each selected variable, the researchers were able to compute an "integrated health value" for each city. The higher the number, the less healthy a city's population.
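How selected indicators might be collapsed into a single per-city value can be illustrated with a generic z-score composite, where a higher value means a less healthy population. The paper's entropy-based aggregation differs, and the cities, variables, and values below are invented.

```python
# A generic illustration of collapsing several selected health indicators into
# one "integrated" value per city (higher = less healthy), using z-scores.
# The paper's entropy-based aggregation differs; cities, variables and values
# here are invented.
from statistics import mean, stdev

cities = {
    "City A": {"obesity": 0.31, "smoking": 0.22, "inactivity": 0.27},
    "City B": {"obesity": 0.18, "smoking": 0.12, "inactivity": 0.16},
    "City C": {"obesity": 0.25, "smoking": 0.17, "inactivity": 0.22},
}
variables = ["obesity", "smoking", "inactivity"]

def integrated_value(city):
    """Average z-score of the city across the selected variables."""
    total = 0.0
    for var in variables:
        values = [c[var] for c in cities.values()]
        total += (cities[city][var] - mean(values)) / stdev(values)
    return total / len(variables)

ranked = sorted(cities, key=integrated_value, reverse=True)  # least healthy first
print(ranked)  # ['City A', 'City C', 'City B']
```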

They found that some cities, such as Detroit, had poor overall health during that timeframe. Others, such as San Francisco, had low values, indicating more favorable health outcomes. Some cities, such as Philadelphia, showed high variability over the four-year period. Cross-sectional comparisons showed a tendency for California cities to score better than cities in other parts of the country. Midwestern cities, including Denver, Minneapolis, and Chicago, appeared to perform poorly compared to other regions, contrary to national city rankings.

Convertino believes that methods like this, fed by large data sets and analyzed via automated stochastic supercomputer models, could be used to optimize research and practice, for example, to guide optimal decisions about health. "These tools can be used by any country, at any administrative level, to process data in real-time and help personalize medical efforts," says Convertino.

But it is not just for health - "The model can be applied to any complex system to determine their Optimal Information Network, in fields from ecology and biology to finance and technology. Untangling their complexities and developing unbiased systemic indicators can help improve decision-making processes," Convertino added.

UCI, other astronomers find plane of dwarf satellites orbiting Centaurus A

An international team of astronomers has determined that Centaurus A, a massive elliptical galaxy 13 million light-years from Earth, is accompanied by a number of dwarf satellite galaxies orbiting the main body in a narrow disk. In a paper published today in Science, the researchers note that this is the first time such a galactic arrangement has been observed outside the Local Group, home to the Milky Way.

"The significance of this finding is that it calls into question the validity of certain cosmological models and simulations as explanations for the distribution of host and satellite galaxies in the universe," said co-author Marcel Pawlowski, a Hubble Fellow in the Department of Physics & Astronomy at the University of California, Irvine. 

He said that under the lambda cold dark matter model, smaller systems of stars should be more or less randomly scattered around their anchoring galaxies and should move in all directions. Yet Centaurus A is the third documented example, behind the Milky Way and Andromeda, of a "vast polar structure" in which dwarf satellites co-rotate around a central galactic mass in what Pawlowski calls "preferentially oriented alignment."

The difficulty of studying the movements of dwarf satellites around their hosts varies according to the target galaxy group. It's relatively easy for the Milky Way. "You get proper motions," Pawlowski said. "You take a picture now, wait three years or more, and then take another picture to see how the stars have moved; that gives you the tangential velocity."
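The proper-motion technique Pawlowski describes reduces to a standard conversion: an angular drift on the sky, combined with a known distance, gives a tangential velocity. A sketch with hypothetical numbers:

```python
# The proper-motion technique: two sky positions years apart give an angular
# drift mu; with a distance d, the tangential velocity follows
#   v_t [km/s] = 4.74 * mu [arcsec/yr] * d [pc]
# (4.74 km/s corresponds to 1 AU per year). The numbers below are hypothetical.

def tangential_velocity(mu_arcsec_per_yr, distance_pc):
    return 4.74 * mu_arcsec_per_yr * distance_pc

# e.g. a satellite at 50 kpc drifting 0.0004 arcsec per year
print(f"{tangential_velocity(4e-4, 50_000):.1f} km/s")  # 94.8 km/s
```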

Using this technique, scientists have measurements for 11 Milky Way satellite galaxies, eight of which are orbiting in a tight disk perpendicular to the spiral galaxy's plane. There are probably other satellites in the system that can't be seen from Earth because they're blocked by the Milky Way's dusty disk.

Andromeda provides observers on Earth a view of the full distribution of satellites around the galaxy's sprawling spiral. An earlier study found 27 dwarf galaxies, 15 arranged in a narrow plane. And Andromeda offers another advantage, according to Pawlowski: "Because you see the galaxy almost edge-on, you can look at the line-of-sight velocities of its satellites to see the ones that are approaching and those that are receding, so it very clearly presents as a rotating disk."

Centaurus A is much farther away, and its satellite companions are faint, making it more difficult to accurately measure distances and velocities to determine movements and distributions. But "sleeping in the archives," Pawlowski said, were data on 16 of Centaurus A's satellites.

"We could do the same game as with Andromeda, where we look at the line-of-sight velocities," he said. "And again we see that half of them are red-shifted, meaning they are receding from us, and the other half are blue-shifted, which tells us they are approaching."
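The line-of-sight test is a Doppler measurement: a spectral line observed redward of its rest wavelength means the satellite is receding, blueward means it is approaching. A sketch using the non-relativistic formula with hypothetical shifts around the H-alpha line:

```python
# The line-of-sight velocity test: a spectral line observed redward of its
# rest wavelength means the satellite is receding; blueward, approaching.
# Non-relativistic Doppler: v = c * (lambda_obs - lambda_rest) / lambda_rest.
# The observed wavelengths below are hypothetical.
C_KM_S = 299_792.458  # speed of light, km/s
H_ALPHA_NM = 656.28   # rest wavelength of the H-alpha line, nm

def radial_velocity(lambda_obs_nm, lambda_rest_nm=H_ALPHA_NM):
    return C_KM_S * (lambda_obs_nm - lambda_rest_nm) / lambda_rest_nm

for obs in (656.45, 656.12):
    v = radial_velocity(obs)
    print("receding" if v > 0 else "approaching", f"{v:+.0f} km/s")
```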

The researchers were able to demonstrate that 14 of the 16 Centaurus A satellite galaxies follow a common motion pattern and rotate along the plane around the main galaxy - contradicting frequently used cosmological models and simulations suggesting that only about 0.5 percent of satellite galaxy systems in the nearby universe should exhibit this pattern.

"So this means that we are missing something," Pawlowski said. "Either the simulations lack some important ingredient, or the underlying model is wrong. This research may be seen as support for looking into alternative models."

Sandeep Kumar, an assistant professor of mechanical engineering at UC Riverside.

Discoveries will help realize the promise of faster, energy-efficient spintronic supercomputers and ultra-high-capacity data storage

Engineers at the University of California, Riverside, have reported advances in so-called "spintronic" devices that will help lead to a new technology for computing and data storage. They have developed methods to detect signals from spintronic components made of low-cost metals and silicon, which overcomes a major barrier to wide application of spintronics. Previously such devices depended on complex structures that used rare and expensive metals such as platinum. The researchers were led by Sandeep Kumar, an assistant professor of mechanical engineering. 


Spintronic devices promise to solve major problems with today's electronic computers, which use massive amounts of electricity and generate heat that requires expending even more energy for cooling. By contrast, spintronic devices generate little heat and use relatively minuscule amounts of electricity. Spintronic supercomputers would require no energy to maintain data in memory. They would also start instantly and have the potential to be far more powerful than today's computers.

While electronics depends on the charge of electrons to generate the binary ones or zeroes of computer data, spintronics depends on the property of electrons called spin. Spintronic materials register binary data via the "up" or "down" spin orientation of electrons--like the north and south of bar magnets--in the materials. A major barrier to development of spintronics devices is generating and detecting the infinitesimal electric spin signals in spintronic materials.

In one paper published in the January issue of the scientific journal Applied Physics Letters, Kumar and colleagues reported an efficient technique of detecting the spin currents in a simple two-layer sandwich of silicon and a nickel-iron alloy called Permalloy. All three of the components are both inexpensive and abundant and could provide the basis for commercial spintronic devices. They also operate at room temperature. The layers were created with the widely used electronics manufacturing processes called sputtering. Co-authors of the paper were graduate students Ravindra Bhardwaj and Paul Lou.

In their experiments, the researchers heated one side of the Permalloy-silicon bi-layer sandwich to create a temperature gradient, which generated an electrical voltage in the bi-layer. The voltage was due to a phenomenon known as the spin-Seebeck effect. The engineers found that they could detect the resulting "spin current" in the bi-layer due to another phenomenon known as the "inverse spin-Hall effect."

The researchers said their findings will have application to efficient magnetic switching in computer memories, and "these scientific breakthroughs may give impetus" to development of such devices. More broadly, they concluded, "These results bring the ubiquitous Si (silicon) to forefront of spintronics research and will lay the foundation of energy efficient Si spintronics and Si spin caloritronics devices."

In two other scientific papers, the researchers demonstrated that they could generate a key property for spintronics materials, called antiferromagnetism, in silicon. The achievement opens an important pathway to commercial spintronics, said the researchers, given that silicon is inexpensive and can be manufactured using a mature technology with a long history of application in electronics.

Ferromagnetism is the property of magnetic materials in which the magnetic poles of the atoms are aligned in the same direction. In contrast, antiferromagnetism is a property in which the neighboring atoms are magnetically oriented in opposite directions. These "magnetic moments" arise from the spin of electrons in the atoms and are central to the application of these materials in spintronics.

In the two papers, Kumar and Lou reported detecting antiferromagnetism in the two types of silicon--called n-type and p-type--used in transistors and other electronic components. N-type semiconductor silicon is "doped" with substances that cause it to have an abundance of negatively-charged electrons; and p-type silicon is doped to have a large concentration of positively charged "holes." Combining the two types enables switching of current in such devices as transistors used in computer memories and other electronics.

In the paper in the Journal of Magnetism and Magnetic Materials, Lou and Kumar reported detecting the spin-Hall effect and antiferromagnetism in n-silicon. Their experiments used a multilayer thin film comprising palladium, nickel-iron Permalloy, manganese oxide and n-silicon.

And in the second paper, in the scientific journal physica status solidi, they reported detecting in p-silicon spin-driven antiferromagnetism and a transition of silicon between metal and insulator properties. Those experiments used a thin film similar to those with the n-silicon.

The researchers wrote in the latter paper that "The observed emergent antiferromagnetic behavior may lay the foundation of Si (silicon) spintronics and may change every field involving Si thin films. These experiments also present potential electric control of magnetic behavior using simple semiconductor electronics physics. The observed large change in resistance and doping dependence of phase transformation encourages the development of antiferromagnetic and phase change spintronics devices."

In further studies, Kumar and his colleagues are developing technology to switch spin currents on and off in the materials, with the ultimate goal of creating a spin transistor. They are also working to generate larger, higher-voltage spintronic chips. The result of their work could be extremely low-power, compact transmitters and sensors, as well as energy-efficient data storage and computer memories, said Kumar.

The study's findings will help train artificial intelligence to diagnose diseases

Researchers used machine learning techniques, including natural language processing algorithms, to identify clinical concepts in radiologist reports for CT scans, according to a study conducted at the Icahn School of Medicine at Mount Sinai and published today in the journal Radiology. The technology is an important first step in the development of artificial intelligence that could interpret scans and diagnose conditions.

From an ATM reading handwriting on a check to Facebook suggesting a photo tag for a friend, computer vision powered by artificial intelligence is increasingly common in daily life. Artificial intelligence could one day help radiologists interpret X-rays, computed tomography (CT) scans, and magnetic resonance imaging (MRI) studies. But for the technology to be effective in the medical arena, computer software must be "taught" the difference between a normal study and abnormal findings.

This study aimed to train this technology how to understand text reports written by radiologists. Researchers created a series of algorithms to teach the computer clusters of phrases. Examples of terminology included words like phospholipid, heartburn, and colonoscopy.

Researchers trained the computer software using 96,303 radiologist reports associated with head CT scans performed at The Mount Sinai Hospital and Mount Sinai Queens between 2010 and 2016. To characterize the "lexical complexity" of radiologist reports, researchers calculated metrics that reflected the variety of language used in these reports and compared these to other large collections of text: thousands of books, Reuters news stories, inpatient physician notes, and Amazon product reviews.
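One simple lexical-variety metric of the kind such corpus comparisons rely on is the type-token ratio: distinct words divided by total words. The study's actual metrics are more elaborate, and the two text snippets below are invented.

```python
# One simple "lexical variety" metric of the kind used in corpus comparisons:
# the type-token ratio (distinct words / total words). The study's metrics are
# more elaborate, and the two snippets below are invented.
def type_token_ratio(text):
    tokens = text.lower().split()
    return len(set(tokens)) / len(tokens)

radiology = "no acute intracranial hemorrhage no mass effect no midline shift"
fiction = "the old man looked at the sea and the sea looked back at him"
print(round(type_token_ratio(radiology), 2), round(type_token_ratio(fiction), 2))
# 0.8 0.64
```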

"The language used in radiology has a natural structure, which makes it amenable to machine learning," says senior author Eric Oermann, MD, Instructor in the Department of Neurosurgery at the Icahn School of Medicine at Mount Sinai. "Machine learning models built upon massive radiological text datasets can facilitate the training of future artificial intelligence-based systems for analyzing radiological images."

Deep learning describes a subcategory of machine learning that uses multiple layers of neural networks (computer systems that learn progressively) to perform inference, requiring large amounts of training data to achieve high accuracy. Techniques used in this study led to an accuracy of 91 percent, demonstrating that it is possible to automatically identify concepts in text from the complex domain of radiology.

"The ultimate goal is to create algorithms that help doctors accurately diagnose patients," says first author John Zech, a medical student at the Icahn School of Medicine at Mount Sinai. "Deep learning has many potential applications in radiology -- triaging to identify studies that require immediate evaluation, flagging abnormal parts of cross-sectional imaging for further review, characterizing masses concerning for malignancy -- and those applications will require many labeled training examples."

"Research like this turns big data into useful data and is the critical first step in harnessing the power of artificial intelligence to help patients," says study co-author Joshua Bederson, MD, Professor and System Chair for the Department of Neurosurgery at Mount Sinai Health System and Clinical Director of the Neurosurgery Simulation Core.

CAPTION Ritesh Agarwal's research on photonic computing focuses on finding the right combination and physical configuration of materials that can amplify and mix light waves in ways analogous to electronic computer components. CREDIT Sajal Dhara

Current computer systems represent bits of information, the 1's and 0's of binary code, with electricity. Circuit elements, such as transistors, operate on these electric signals, producing outputs that are dependent on their inputs.

As fast and powerful as computers have become, Ritesh Agarwal, professor in the Department of Materials Science and Engineering in the University of Pennsylvania's School of Engineering and Applied Science, knows they could be more powerful. The field of photonic computing aims to achieve that goal by using light as the medium.

Agarwal's research on photonic computing has been focused on finding the right combination and physical configuration of materials that can amplify and mix light waves in ways that are analogous to electronic computer components.

In a paper published in Nature Communications, he and his colleagues have taken an important step: precisely controlling the mixing of optical signals via tailored electric fields, and obtaining outputs with a near perfect contrast and extremely large on/off ratios. Those properties are key to the creation of a working optical transistor.

"Currently, to compute '5+7,' we need to send an electrical signal for '5' and an electrical signal for '7,' and the transistor does the mixing to produce an electrical signal for '12,'" Agarwal said. "One of the hurdles in doing this with light is that materials that are able to mix optical signals also tend to have very strong background signals as well. That background signal would drastically reduce the contrast and on/off ratios leading to errors in the output."

With background signals washing out the intended output, necessary computational qualities for optical transistors -- such as their on/off ratio, modulation strength and signal-mixing contrast -- have all been extremely poor. Electronic transistors must meet high standards for these qualities to prevent errors.

The search for materials that can serve in optical transistors is complicated by additional property requirements. Only "nonlinear" materials are capable of this kind of optical signal mixing.

To address this issue, Agarwal's research group started with a system that has no background signal at all: a nanoscale "belt" made out of cadmium sulfide. Then, by applying an electric field across the nanobelt, Agarwal and his colleagues were able to introduce optical nonlinearities to the system, enabling a signal-mixing output that was otherwise zero.

"Our system turns on from zero to extremely large values, and hence has perfect contrast, as well as large modulation and on/off ratios," Agarwal said. "Therefore, for the first time, we have an optical device with output that truly resembles an electronic transistor."

With one of the key components coming into focus, the next steps toward a photonic computer will involve integrating them with optical interconnects, modulators, and detectors in order to demonstrate actual computation.

The Data Science Institute (DSI) at Columbia awarded Seeds Fund Grants to five research teams whose proposed projects will use state-of-the-art data science to solve seemingly intractable societal problems in the fields of cancer research, medical science, transportation and technology.

Each team will receive up to $100,000 for one year and be eligible for a second year of funding.

"In awarding these grants, the DSI review committee selected projects that brought together teams of scholars who aspire to push the state-of-the-art in data science by conducting novel, ethical and socially beneficial research," said Jeannette M. Wing, Avanessians Director of the Data Science Institute. "The five winning teams combine data-science experts with domain experts who'll work together to transform several fields throughout the university."

The inaugural DSI Seeds Fund Program supports collaborations between faculty members and researchers from various disciplines, departments and schools throughout Columbia University. DSI received several excellent proposals from faculty, which shows a growing and enthusiastic interest in data-science research at Columbia, Wing added.

The seed program is just one of many initiatives that Wing has spearheaded since the summer of 2017, when she was named director of DSI, a world-leading institution in the field of data science. The other initiatives include founding a Post-doctoral Fellows Program; a Faculty Recruiting Program; and an Undergraduate Research Program.

What follows are brief descriptions of the winning projects, along with the names and affiliations of the researchers on each team.

p(true): Distilling Truth by Community Rating of Claims on the Web:

The team: Nikolaus Kriegeskorte, Professor, Psychology, and Director of Cognitive Imaging, Zuckerman Institute; Chris Wiggins, Associate Professor, Department of Applied Physics and Applied Mathematics, Columbia Engineering; Nima Mesgarani, Assistant Professor, Electrical Engineering Department, Columbia Engineering; Trenton Jerde, Lecturer, Applied Analytics Program, School of Professional Studies.

The social web is driven by feedback mechanisms, or "likes," which emotionalize the sharing culture and may contribute to the formation of echo chambers and political polarization, according to this team. In their p(true) project, the team will thus build a complementary mechanism for web-based sharing of reasoned judgments, so as to bring rationality to the social web.

Websites such as Wikipedia and Stack Overflow are surprisingly successful at providing a reliable representation of uncontroversial textbook knowledge, the team says. But the web doesn't work well in distilling the probability of contentious claims. The question the team seeks to answer is this: How can the web best be used to enable people to share their judgments and work together to find the truth?

The web gives users amazing power to communicate and collaborate, but users have yet to learn how to use that power to distill the truth, the team says. Web users can access content, share it with others, and give instant feedback on claims. But those actions end up boosting certain claims while blocking others, which amounts to a powerful mechanism of amplification and filtering. If the web is to help people think well together, then the mechanism that determines what information is amplified, the team maintains, should be based on rational judgment, rather than emotional responses as communicated by "likes" and other emoticons.

The team's goal is to build a website, called p(true), that enables people to rate and discuss claims (e.g., a New York Times headline) on the web. The team's method will enable the community to debunk falsehoods and lend support to solid claims. It's also a way to share reasoned judgments, pose questions to the community and start conversations.
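One plausible way such a site could pool users' probability ratings of a claim is to average them in log-odds space, a standard approach to combining probability judgments. The p(true) project's actual aggregation rule is not specified here; this is purely illustrative.

```python
# Purely illustrative: pooling users' probability ratings of a claim by
# averaging in log-odds space, a standard way of combining probability
# judgments. The p(true) project's actual aggregation rule isn't given here.
from math import exp, log

def pooled_probability(ratings):
    """Combine probabilities in (0, 1) via their mean log-odds."""
    logits = [log(p / (1 - p)) for p in ratings]
    mean_logit = sum(logits) / len(logits)
    return 1 / (1 + exp(-mean_logit))

# Three hypothetical users rate a claim 90%, 80% and 70% likely to be true
print(round(pooled_probability([0.9, 0.8, 0.7]), 2))  # 0.81
```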

Planetary Linguistics:

The team: David Kipping, Assistant Professor, Department of Astronomy; Michael Collins, Professor, Computer Science Department, Columbia Engineering.

In the last few years, the catalog of known exoplanets - planets orbiting other stars - has swelled from a few hundred to several thousand. With 200 billion stars in our galaxy alone and with the rise of online planet-hunting missions, an explosion of discovery of exoplanets seems imminent. Indeed, it has been argued that the cumulative number of known exoplanets doubles every 27 months, analogous to Moore's law, implying a million exoplanets by 2034, and a billion by 2057.
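The quoted projection is simple exponential arithmetic: given a starting count and a 27-month doubling time, the crossing years follow directly. The starting point used below, roughly 3,500 known exoplanets in 2017, is an assumption for illustration.

```python
# Checking the quoted projection: with a 27-month doubling time, when does the
# cumulative count cross one million and one billion? The starting point
# (roughly 3,500 known exoplanets in 2017) is an assumption for illustration.
from math import log2

def year_reaching(target, n0=3500, year0=2017, doubling_months=27):
    doublings = log2(target / n0)
    return year0 + doublings * doubling_months / 12

print(round(year_reaching(1e6)), round(year_reaching(1e9)))
# close to the article's quoted 2034 and 2057
```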

By studying the thousands of extrasolar planets that have been discovered in recent years, can researchers infer the rules by which planetary systems emerge and evolve? This team aims to answer that question by drawing upon an unusually interesting source: computational linguistics.

The challenge of understanding the emergence of planetary systems must be done from afar with limited information, according to the team. In that way, it's similar to attempting to learn an unknown language from snippets of conversation, says Kipping and Collins. Drawing upon this analogy, the two will explore whether the mathematical tools of computational linguistics can reveal the "grammatical" rules of planet formation. Their goals include building predictive models, much like the predictive text on a smartphone, which will be capable of intelligently optimizing telescope resources. They also aim to uncover the rules and regularities in planetary systems, specifically through the application of grammar-induction methods used in computational linguistics.

The team maintains that the pre-existing tools of linguistics and information theory are ideally suited for unlocking the rules of planet formation. Using this framework, they'll consider each planet to be analogous to a word; each planetary system to be a sentence; and the galaxy to be a corpus of text from which they might blindly infer the grammatical rules. The basic thesis of their project is to see whether they can use the well-established mathematical tools of linguistics and information theory to improve our understanding of exoplanetary systems.

The three central questions they hope to answer are: Is it possible to build predictive models that assign a probability distribution over different planetary-system architectures? Can those models predict the presence and properties of missing planets in planetary systems? And can they uncover the rules and regularities in planetary systems, specifically through the application of grammar-induction methods used in computational linguistics?
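The "planets as words" analogy can be made concrete with a toy bigram language model over invented planet tokens; the team's grammar-induction methods are far richer than this sketch.

```python
# A toy version of the "planets as words" analogy: planets become size tokens,
# systems become sentences, and a bigram model assigns next-"word"
# probabilities. The systems below are invented; the team's grammar-induction
# methods are far richer than this sketch.
from collections import Counter

systems = [
    ["rocky", "rocky", "giant", "giant"],
    ["rocky", "giant", "giant"],
    ["rocky", "rocky", "giant"],
]

bigrams, context = Counter(), Counter()
for s in systems:
    tokens = ["<s>"] + s + ["</s>"]  # sentence boundary markers
    for a, b in zip(tokens, tokens[1:]):
        bigrams[(a, b)] += 1
        context[a] += 1

def next_word_probs(prev):
    """P(next token | previous token) under the bigram model."""
    return {b: c / context[prev] for (a, b), c in bigrams.items() if a == prev}

print(next_word_probs("rocky"))  # {'rocky': 0.4, 'giant': 0.6}
```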

In short, the team says it intends to "speak planet."

A Game-Theoretical Framework for Modeling Strategic Interactions Between Autonomous and Human-Driven Vehicles:

The team: Xuan Sharon Di, Assistant Professor, Department of Civil Engineering and Engineering Mechanics, Columbia Engineering; Qiang Du, Professor, Applied Physics and Applied Mathematics, Columbia Engineering and Data Science Institute; Xi Chen, Associate Professor, Computer Science Department, Columbia Engineering; Eric Talley, Professor and Co-Director, Millstein Center for Global Markets and Corporate Ownership, Columbia Law School.

Autonomous vehicles, expected to arrive on roads over the next decade, will connect with each other and with the surrounding environment by exchanging messages containing timestamps, current location, speed, and more.

Yet no one knows exactly how autonomous vehicles (AVs) and conventional human-driven vehicles (HVs) will co-exist and interact on roads. The vast majority of research has considered the engineering challenges from the perspective of either a single AV immersed in a world of human drivers or a road occupied only by AVs. Much less research has focused on the transition path between these two scenarios. To fill that gap, the team will develop a game-theoretic framework that models the strategic interactions likely to link the two types of vehicles.

Along with exploring these technical matters, the team will address the ethical issues associated with autonomous vehicles. What decision should an AV make when confronted with an obstacle? If it swerves to miss the obstacle, it will hit five people, including an old woman and a child, whereas if it goes straight it will hit a car and injure five passengers. Algorithms are designed to solve such problems, but bias can creep in depending on how one frames the problem. The team will investigate how to design algorithms that account for such ethical dilemmas while eliminating bias.
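To make the game-theoretic framing concrete, here is a minimal sketch (with entirely made-up payoffs) of a two-player merge game between an AV and an HV, solved by enumerating pure-strategy Nash equilibria; the team's actual models will be far richer:

```python
import itertools

# Toy illustration of the game-theoretic framing, with invented payoffs.
# Row player: autonomous vehicle (AV); column player: human driver (HV).
# Each chooses to Yield or Go at a merge; both choosing "Go" risks a collision.
ACTIONS = ["Yield", "Go"]
PAYOFFS = {  # (AV action, HV action) -> (AV payoff, HV payoff)
    ("Yield", "Yield"): (1, 1),
    ("Yield", "Go"):    (2, 4),
    ("Go",    "Yield"): (4, 2),
    ("Go",    "Go"):    (-10, -10),  # collision
}

def pure_nash_equilibria(payoffs):
    """Enumerate action pairs where neither player gains by deviating alone."""
    eqs = []
    for av, hv in itertools.product(ACTIONS, repeat=2):
        u_av, u_hv = payoffs[(av, hv)]
        av_ok = all(payoffs[(a, hv)][0] <= u_av for a in ACTIONS)
        hv_ok = all(payoffs[(av, h)][1] <= u_hv for h in ACTIONS)
        if av_ok and hv_ok:
            eqs.append((av, hv))
    return eqs

print(pure_nash_equilibria(PAYOFFS))  # [('Yield', 'Go'), ('Go', 'Yield')]
```

Even this toy game has two equilibria (one vehicle yields, the other goes), which hints at the coordination problems a mixed AV/HV road must resolve.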

Predicting Personalized Cancer Therapies Using Deep Probabilistic Modeling:

The team: David Blei, Professor, Statistics and Computer Science Department, Columbia Engineering; Raul Rabadan, Professor, Systems Biology and Biomedical Informatics, CUMC; Anna Lasorella, Associate Professor, Pathology and Cell Biology and Pediatrics, CUMC; Wesley Tansey, Postdoctoral Research Scientist, Department of Systems Biology, CUMC.

Precision medicine aims to find the right drug for the right patient at the right moment and with the proper dosage. Such accuracy is especially relevant in cancer treatments, where standard therapies elicit different responses from different patients.

Cancers are caused by the accumulation of alterations in the genome. Exact causes vary between tumors, and no two tumors have the same set of alterations. By mapping a comprehensive set of causes for specific tumors, one could provide patient-specific drug recommendations; indeed, drugs targeting specific genomic alterations already provide some of the most successful therapies.

This team has identified new alterations targeted by drugs that are either currently approved or in clinical trials. Most tumors, however, lack specific targets and patient-specific therapies. A recent study of 2,600 patients at the M.D. Anderson Cancer Center, for instance, showed that genetic analysis permits only 6.4 percent of patients to be paired with a drug aimed specifically at the mutation deemed responsible. The team believes this low percentage highlights the need for new approaches to matching drugs to genomic profiles.

The team's goal, therefore, is to model, predict, and target therapeutic sensitivity and resistance in cancer. The predictions will be validated in cancer models developed in Dr. Lasorella's lab, which will enhance the accuracy of the effort. The team will also integrate Bayesian modeling with variational inference and deep-learning methods, leveraging the expertise of two leading groups in computational genomics (Rabadan's group) and machine learning (Blei's group) across the Medical and Morningside Heights campuses.

They have sequenced RNA and DNA from large collections of tumors and have access to databases of genomic, transcriptomic and drug response profiles of more than 1,000 cancer-cell lines. They will use their expertise in molecular characterization of tumors and machine-learning techniques to model and explore these databases.

Enabling Small-Data Medical Research with Private Transferable Knowledge from Big Data:

The team: Roxana Geambasu, Associate Professor, Computer Science Department, Columbia Engineering; Nicholas Tatonetti, Herbert Irving Assistant Professor, Biomedical Informatics, CUMC; Daniel Hsu, Assistant Professor, Computer Science Department, Columbia Engineering.

Clinical data hold enormous promise for advancing medical science. A major roadblock is the lack of infrastructural support for sharing large-scale clinical datasets across institutions. Virtually every hospital and clinic collects detailed medical records about its patients. For example, New York Presbyterian has developed a large-scale, longitudinal dataset, called the Clinical Data Warehouse, that contains clinical information from more than five million patients.


Such datasets are made available to researchers within their home institutions through the Institutional Review Board (IRB), but hospitals are usually wary of sharing data with other institutions. Their main concern is privacy, along with the legal and public-relations implications of a potential data breach. Hospitals will sometimes permit the sharing of statistics derived from their data, but cross-institution IRB approvals generally take years to finalize and often end in rejection. Siloing these large-scale clinical datasets thus limits the scope, rigor, and pace of innovation in data-driven medical science.

To overcome this challenge, the team is building an infrastructure system for sharing machine-learning models of large-scale, clinical datasets that will also preserve patients' privacy. The new system will enable medical researchers in small clinics or pharmaceutical companies to incorporate multitask feature models that are learned from big clinical datasets. The researchers, for example, could call upon New York Presbyterian's Clinical Data Warehouse and bootstrap their own machine-learning models on top of smaller clinical datasets. The multitask feature models, moreover, will protect the privacy of patient records in the large datasets through a rigorous method called differential privacy. The team anticipates that the system will vastly improve the pace of innovation in clinical-data research while alleviating privacy concerns.
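As a minimal illustration of the differential-privacy guarantee the system relies on, the classic Laplace mechanism adds calibrated noise to a statistic so that any single patient's record has only a bounded influence on the released value. The query, dataset, and epsilon below are invented for illustration; the team's system releases privacy-preserving models, not just noisy counts:

```python
import math
import random

def laplace_noise(scale, rng):
    """Sample from a Laplace(0, scale) distribution via inverse CDF."""
    u = rng.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon, rng):
    """Release a count query (sensitivity 1) under epsilon-differential privacy."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon, rng)

# Hypothetical records: count patients aged 50 or older, privately.
rng = random.Random(0)
patients = [{"age": a} for a in (34, 61, 47, 70, 55)]
noisy = private_count(patients, lambda p: p["age"] >= 50, epsilon=1.0, rng=rng)
print(round(noisy, 2))  # near the true count of 3, but randomized
```

Smaller epsilon means more noise and stronger privacy; the design question for a system like this one is how to spend a limited privacy budget across an entire learned model rather than a single statistic.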

New tools will fight skin condition, personalize diagnosis and management, and identify leads for drug treatment

An experienced interdisciplinary team of psoriasis and computational researchers from Case Western Reserve University School of Medicine (CWRU SOM) and University Hospitals Cleveland Medical Center (UHCMC) has received a $6.5M, five-year grant from the National Institute of Arthritis and Musculoskeletal and Skin Diseases (NIAMS).

The grant supports a Center of Research Translation in Psoriasis (CORT) at CWRU and UHCMC.

"The CORT brings together the strengths of the Department of Dermatology and the Murdough Family Center for Psoriasis in psoriasis care and research with the innovative approaches of our Institute for Computational Biology, and Department of Population & Quantitative Health Sciences (PQHS)," said Kevin Cooper, MD, who serves as NIH Contact Principal Investigator and Administrative Director of the Center.

The CORT will integrate cutting-edge technology and bioinformatics with basic and clinical science in order to advance translational discovery and application in psoriasis. The goal is to better identify and treat those psoriasis patients that are more susceptible to developing comorbidities (simultaneous medical conditions) associated with psoriasis, such as cardiovascular disease, diabetes, depression, and psoriatic arthritis. Currently it is difficult for physicians to determine which patients will develop these comorbidities.

The researchers will cull data from blood and skin samples of UHCMC psoriasis patients and from preclinical models, using a systems biology approach to look for new patterns and relationships. The investigative team will combine these data with psoriasis-patient information from CLEARPATH, an Ohio-based database that integrates electronic medical records (EMRs) from multiple hospital systems.

"Armed with this large pool of data and new ways to work with it, we can make better connections between groups of patients with similar forms of psoriasis versus an individual's unique biology and therapy options," said Mark Cameron, PhD from the department of PQHS, who leads the computational biology team.

This approach will use tailored computational processes to zero in on drug candidates or repurposed drugs matching the patient profiles from a large database of tens of thousands of drugs--and test the drugs in genetically engineered psoriasis mouse models. The hope is that drugs that are successful in treating the mice will advance to humans.
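One common computational pattern for this kind of profile-based matching (a hypothetical sketch, not necessarily the CORT's pipeline) is signature reversal: represent the disease and each drug as gene-expression signature vectors, and rank drugs by how strongly their signatures anti-correlate with the disease signature. All names and numbers below are invented:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length signature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def rank_candidates(disease_signature, drug_signatures):
    """Rank drugs by how strongly they reverse the disease signature."""
    scores = {name: -cosine(disease_signature, sig)
              for name, sig in drug_signatures.items()}
    return sorted(scores, key=scores.get, reverse=True)

disease = [1.0, -0.5, 2.0]        # up/down-regulation of three genes
drugs = {
    "drug_A": [-0.9, 0.4, -1.8],  # roughly reverses the signature
    "drug_B": [1.1, -0.4, 2.1],   # roughly mimics the signature
    "drug_C": [0.1, 0.9, -0.2],
}
print(rank_candidates(disease, drugs))  # drug_A ranks first
```

Top-ranked candidates from a screen like this would then be tested in the engineered mouse models described above before any move toward human trials.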

"The CWRU/UHCMC CORT is a focal point for new and innovative mouse models that mimic psoriasis in humans--including, crucially, comorbidities of human psoriasis patients," said the project's preclinical lead investigator, Nicole Ward, PhD, from the Department of Dermatology.

"We're getting better and better at managing psoriasis patients' skin disease," said Cooper. "But we still don't have a complete grasp of the comorbidities. Through this form of personalized medicine, we think we can make great strides in determining which psoriasis patients are likely to suffer from the various co-occurring ailments, ultimately fashioning treatments for them."

Published study involving Yale-NUS undergraduates provides evidence of similarities between two different mathematical models impacting magnetoresistance research

Two undergraduates at Yale-NUS College in Singapore are part of a research team that concluded that two different mathematical models, which describe the same physical phenomenon, are essentially equivalent. The discovery could have implications for future research into magnetoresistance and its practical applications in a wide range of electronic devices. After implementing the two models of magnetoresistance as supercomputer simulations, Lai Ying Tong, 21, and Silvia Lara, 22, found that the simulations produced similar results under identical conditions. Magnetoresistance is a physical phenomenon in which the electric resistivity of a material changes when it is subjected to a magnetic field. The research was published in the peer-reviewed journal Physical Review B in December 2017 and presented at international conferences in 2016 and 2017.

The two Yale-NUS undergraduates worked on the project under the mentorship of Associate Professor Shaffique Adam from Yale-NUS College and the Department of Physics at the National University of Singapore's (NUS) Faculty of Science, and Associate Professor Meera Parish from Monash University. They were guided by Navneeth Ramakrishnan, a Masters student at the NUS Department of Physics and the NUS Centre for Advanced 2D Materials, who checked their results and wrote the paper. The findings provide a unified theoretical framework for understanding a phenomenon known as 'linear unsaturating magnetoresistance', along with clear predictions on how to manipulate the effect. Prior to this research, two separate theoretical models had been proposed to describe the phenomenon: the Random Resistance Network (RRN) model and the Effective Medium Theory (EMT) model. Experimentalists exploring magnetoresistance generally refer to one of these two models to contextualise their experiments, but do not provide a detailed comparison between the theories and their experimental results. This latest finding not only unifies the two existing theories but also validates that they are accurate descriptions that match experimental data.
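For readers unfamiliar with effective-medium ideas, the sketch below solves the classic zero-magnetic-field Bruggeman EMT equation for a two-phase 2D mixture by bisection. It is a much simpler cousin of the magnetotransport EMT used in the published work, shown only to illustrate the self-consistent flavor of the approach:

```python
# Bruggeman effective-medium theory (2D, zero field, two phases):
# solve  f*(s1 - se)/(s1 + se) + (1 - f)*(s2 - se)/(s2 + se) = 0  for se,
# where f is the area fraction of phase 1 and s1, s2 its conductivities.

def effective_conductivity(s1, s2, f, tol=1e-12):
    """Solve the 2D Bruggeman EMT equation for a binary mixture by bisection."""
    def residual(se):
        return f * (s1 - se) / (s1 + se) + (1 - f) * (s2 - se) / (s2 + se)
    lo, hi = min(s1, s2), max(s1, s2)  # the solution lies between the phases
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0:  # residual decreases as se grows
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Symmetric 50/50 mixture: 2D EMT predicts the geometric mean of the phases.
print(effective_conductivity(1.0, 4.0, 0.5))  # ≈ 2.0
```

A random resistor network attacks the same question by brute-force simulation of a large lattice of discrete resistors, which is why showing that the two approaches agree under identical conditions is a meaningful consistency check.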

The findings have a direct impact on future research into magnetoresistance, which has practical applications in a diverse range of electronic devices, such as speed sensors, mobile phones, washing machines, and laptops. The principles of magnetoresistance are currently used in magnetic memory storage in hard drives, and certain companies are aiming to produce sensitive magnetometers -- devices that measure magnetic fields -- that can operate at room temperature. This is a billion-dollar industry supporting applications in many aspects of everyday life, ranging from automobile collision warnings to traffic-light burnout detection.

Ms Lai and Ms Lara began this research as a summer project in their first year of undergraduate education, under the guidance of Assoc Prof Adam, who is also with the Centre for Advanced 2D Materials at NUS. Assoc Prof Adam highlighted both students' roles in the research, noting that they reviewed the existing literature, implemented the mathematical models in MATLAB, and ran the simulations and subsequent analyses. The students also presented the findings at international conferences, including the American Physical Society March Meeting 2017.

Yale-NUS College funded the undergraduate students to work on this project. "This level of undergraduate engagement, not only in the research, but in shaping the direction of the work is extremely rare. At Yale-NUS, science students are able to actively participate in such research very early on in their learning experience," said Assoc Prof Adam.

  1. Fujitsu president Tatsuya Tanaka receives the French government's prestigious Legion of Honor
  2. Dutch researchers accelerate the pace of development of silicon quantum supercomputer chip
  3. Belgian Ph.D. student decodes DNA, wins a Bitcoin
  4. Artificial intelligence predicts corruption in Spain
  5. UK scientists run new simulations that show Sanchi oil spill contamination could reach Japan within a month
  6. BU med school uses new AI to significantly improve human kidney analysis
  7. KU researchers use machine learning to predict new details of geothermal heat flux beneath the Greenland Ice Sheet
  8. SETI project homes in on strange 'fast radio bursts'
  9. NYC Health Department spots 10 outbreaks of foodborne illness using Yelp reviews since 2012
  10. CaseMed researchers win $2.8 million to repurpose FDA-approved drugs to treat Alzheimer's disease
  11. UMN makes new discovery that improves brain-like memory, supercomputing
  12. Activist investor Starboard calls for substantial change at Israel’s Mellanox
  13. KU researchers make solar energy more efficient
  14. Researchers discover serious processor vulnerability
  15. We need one global network of 1000 stations to build an Earth observatory
  16. Statistical test relates pathogen mutation to infectious disease progression
  17. Reich lab creates collaborative network for combining models to improve flu season forecasts
  18. VLA-detected radio waves point to likely explanation for neutron-star merger phenomena
  19. Germany’s Tübingen University physicists are the first to link atoms and superconductors in key step towards new hardware for quantum supercomputers, networks
  20. How great is the influence, risk of social, political bots?
