Technology surrounds us – technology that simplifies our jobs, our commutes, our communication, and much more. These advances have become a boon to our lives, streamlining tasks that would conventionally take a long time to complete. Looking back, so many new technologies have reshaped the world that it is nearly impossible to list them all at once, and further advances will affect our lives in ways we cannot yet imagine.
MIT Technology Review has drafted a list of 10 breakthrough technologies that will revolutionize our lives in the coming years.
Unhackable internet

An internet based on quantum physics will soon enable inherently secure communication. A team led by Stephanie Wehner, at Delft University of Technology, is building a network connecting four cities in the Netherlands entirely by means of quantum technology. Messages sent over this network will be unhackable.
The Delft network will be the first to transmit information between cities using quantum techniques from end to end.
The technology relies on a quantum behavior of atomic particles called entanglement. Entangled photons can’t be covertly read without disrupting their content.
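The eavesdropping-detection idea can be sketched with a toy simulation. This is a heavily simplified, hypothetical model, not the Delft protocol: it only captures the statistical signature that an intercept-and-resend attacker leaves behind.

```python
import random

def run_exchange(n_pairs, eavesdropper=False, seed=0):
    """Toy model of entanglement-based key exchange, heavily simplified.

    When Alice and Bob happen to measure in the same basis, their shared
    entangled pair gives them identical bits. An eavesdropper who measures
    and resends photons destroys the correlation in ~25% of matched rounds,
    so a nonzero error rate on a sampled part of the key reveals the intrusion.
    """
    rng = random.Random(seed)
    alice_key, bob_key = [], []
    for _ in range(n_pairs):
        a_basis, b_basis = rng.randint(0, 1), rng.randint(0, 1)
        bit = rng.randint(0, 1)                # shared outcome of the pair
        a_bit, b_bit = bit, bit
        if eavesdropper:
            # Eve guesses a basis; a wrong guess randomizes Bob's outcome
            if rng.randint(0, 1) != b_basis:
                b_bit = rng.randint(0, 1)
        if a_basis == b_basis:                 # keep only matched-basis rounds
            alice_key.append(a_bit)
            bob_key.append(b_bit)
    errors = sum(a != b for a, b in zip(alice_key, bob_key))
    return errors / max(len(alice_key), 1)
```

Without an eavesdropper the matched-basis error rate is exactly zero; with one it jumps to roughly 25%, which is what the two parties check for before trusting the key.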
Hyper-personalized medicine

Here’s a definition of a hopeless case: a child with a fatal disease so exceedingly rare that not only is there no treatment, there’s not even anyone in a lab coat studying it. “Too rare to care,” goes the saying.
That’s about to change, thanks to new classes of drugs that can be tailored to a person’s genes. If an extremely rare disease is caused by a specific DNA mistake—as several thousand are—there’s now at least a fighting chance for a genetic fix through hyper-personalized medicine. One such case is that of Mila Makovec, a little girl suffering from a devastating illness caused by a unique genetic mutation, who got a drug manufactured just for her. Her case made the New England Journal of Medicine in October after doctors moved from a readout of her genetic error to treatment in just a year. They called the drug milasen, after her. The treatment hasn’t cured Mila. But it seems to have stabilized her condition: it has reduced her seizures, and she has begun to stand and walk with assistance.
Mila’s treatment was possible because creating a gene medicine has never been faster or had a better chance of working. The new medicines might take the form of gene replacement, gene editing, or antisense (the type Mila received), a sort of molecular eraser, which erases or fixes erroneous genetic messages. What the treatments have in common is that they can be programmed, in digital fashion and with digital speed, to correct or compensate for inherited diseases, letter for DNA letter.
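The “letter for DNA letter” idea behind antisense can be illustrated in a few lines. The sequence below is invented, and real antisense drugs like milasen involve chemical modification and extensive screening far beyond base-pairing; this only shows how an oligo is derived as the reverse complement of its target message.

```python
# Map each mRNA base to the DNA base it pairs with (A pairs with T in DNA).
COMPLEMENT = {"A": "T", "U": "A", "G": "C", "C": "G"}

def antisense_oligo(mrna_region: str) -> str:
    """Return the DNA oligo that base-pairs with an mRNA region.

    The oligo is the reverse complement of the target, so it can hybridize
    antiparallel to the erroneous message and block or correct its readout.
    """
    return "".join(COMPLEMENT[base] for base in reversed(mrna_region))

# e.g. for a hypothetical target region:
# antisense_oligo("AUGGCU") -> "AGCCAT"
```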
Digital money

Last June Facebook unveiled a “global digital currency” called Libra. The idea triggered a backlash and Libra may never launch, at least not in the way it was originally envisioned. But it’s still made a difference: just days after Facebook’s announcement, an official from the People’s Bank of China implied that the bank would speed up development of its own digital currency in response. Now China is poised to become the first major economy to issue a digital version of its money, which it intends as a replacement for physical cash.
Anti-aging drugs

The first wave of a new class of anti-aging drugs has begun human testing. These drugs won’t let you live longer (yet) but aim to treat specific ailments by slowing or reversing a fundamental process of aging.
The drugs are called senolytics—they work by removing certain cells that accumulate as we age. Known as “senescent” cells, they can create low-level inflammation that suppresses normal mechanisms of cellular repair and creates a toxic environment for neighboring cells.
AI-discovered molecules

The universe of molecules that could be turned into potentially life-saving drugs is mind-boggling in size: researchers estimate the number at around 10⁶⁰. That’s more than all the atoms in the solar system, offering virtually unlimited chemical possibilities—if only chemists could find the worthwhile ones.
Now machine-learning tools can explore large databases of existing molecules and their properties, using the information to generate new possibilities. This AI-enabled technology could make it faster and cheaper to discover new drug candidates.
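A brute-force caricature of the idea: enumerate a tiny slice of “chemical space” and rank candidates with a stand-in property predictor. The scoring function and alphabet here are invented for illustration; a real pipeline would use a model trained on known molecules, and generative sampling rather than enumeration, since 10⁶⁰ candidates can never be enumerated.

```python
import itertools

def toy_property_score(molecule: str) -> float:
    """Stand-in for a learned property predictor (e.g. binding affinity).
    Invented scoring rule; a real pipeline would use a trained model."""
    return molecule.count("N") * 1.5 + molecule.count("O") - 0.2 * len(molecule)

def screen_candidates(alphabet="CNOH", length=5, top_k=3):
    """Enumerate a tiny slice of chemical space and rank by predicted score.

    Brute force works only for toy sizes like this (4**5 = 1024 strings);
    at realistic scales, generative models must propose candidates instead.
    """
    candidates = ("".join(c) for c in itertools.product(alphabet, repeat=length))
    return sorted(candidates, key=toy_property_score, reverse=True)[:top_k]
```

The generate-and-score loop is the same shape real systems use; only the generator and the scorer become learned models.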
Satellite mega-constellations

Satellites can now beam a broadband connection to internet terminals on the ground. As long as these terminals have a clear view of the sky, they can deliver the internet to any nearby devices. SpaceX alone wants to send more than 4.5 times as many satellites into orbit this decade as humans have launched since Sputnik.
These mega-constellations are feasible because we have learned how to build smaller satellites and launch them more cheaply. During the space shuttle era, launching a satellite into space cost roughly US$24,800 per pound. A small communications satellite that weighed four tons cost nearly $200 million to fly up.
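The cited figure checks out as simple arithmetic (assuming US short tons of 2,000 pounds):

```python
# Checking the shuttle-era launch cost: four tons at ~$24,800 per pound.
COST_PER_POUND = 24_800      # US$, space shuttle era (approximate)
POUNDS_PER_TON = 2_000       # assuming US short tons

satellite_weight_lb = 4 * POUNDS_PER_TON
launch_cost = satellite_weight_lb * COST_PER_POUND
print(f"${launch_cost:,}")   # $198,400,000 — roughly the $200 million cited
```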
Quantum supremacy

Quantum computers store and process data in a way completely different from the ones we’re all used to. In theory, they could tackle certain classes of problems that even the most powerful classical supercomputer imaginable would take millennia to solve, like breaking today’s cryptographic codes or simulating the precise behavior of molecules to help discover new drugs and materials.
There have been working quantum computers for several years, but it’s only under certain conditions that they outperform classical ones, and in October Google claimed the first such demonstration of “quantum supremacy.” A computer with 53 qubits—the basic unit of quantum computation—did a calculation in a little over three minutes that, by Google’s reckoning, would have taken the world’s biggest supercomputer 10,000 years, or 1.5 billion times as long. IBM challenged Google’s claim, saying the speedup would be a thousandfold at best; even so, it was a milestone, and each additional qubit doubles the size of the quantum state the machine works with, making it exponentially harder for classical computers to keep up.
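The exponential flavor of those numbers comes from how quantum states scale: describing n qubits classically takes 2^n complex amplitudes, so every extra qubit doubles the memory a classical simulator needs. A minimal illustration:

```python
def statevector_size(n_qubits: int) -> int:
    """Number of complex amplitudes needed to describe an n-qubit state."""
    return 2 ** n_qubits

for n in (1, 2, 10, 53):
    print(n, statevector_size(n))

# At 53 qubits (the size of Google's chip), the state vector already holds
# 2**53 ≈ 9e15 amplitudes — petabytes of memory just to store it classically.
```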
Tiny AI

AI has a problem: in the quest to build more powerful algorithms, researchers are using ever greater amounts of data and computing power and relying on centralized cloud services. This not only generates alarming amounts of carbon emissions but also limits the speed and privacy of AI applications.
But a countertrend of tiny AI is changing that. Tech giants and academic researchers are working on new algorithms to shrink existing deep-learning models without losing their capabilities. Meanwhile, an emerging generation of specialized AI chips promises to pack more computational power into tighter physical spaces, and train and run AI on far less energy.
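One of the simplest shrinking tricks is post-training quantization: storing 32-bit float weights as 8-bit integers plus a scale factor, cutting model size roughly 4x. A minimal sketch; real frameworks add per-channel scales, calibration data, and quantized kernels.

```python
def quantize(weights, n_bits=8):
    """Map float weights to signed n-bit integers plus one scale factor."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / (2 ** (n_bits - 1) - 1)   # e.g. range [-127, 127] for 8 bits
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the integer representation."""
    return [v * scale for v in q]

weights = [0.81, -0.35, 0.02, -1.27]            # made-up example weights
q, scale = quantize(weights)
approx = dequantize(q, scale)
# 8 bits per weight instead of 32, at the cost of small rounding error
```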
Differential privacy

In 2020, the US government has a big task: collect data on the country’s 330 million residents while keeping their identities private. The data is released in statistical tables that policymakers and academics analyze when writing legislation or conducting research. By law, the Census Bureau must make sure that the data can’t be traced back to any individual.
But there are tricks to “de-anonymize” individuals, especially if the census data is combined with other public statistics.
So the Census Bureau injects inaccuracies, or “noise,” into the data. It might make some people younger and others older, or label some white people as black and vice versa while keeping the totals of each age or ethnic group the same. The more noise you inject, the harder the de-anonymization becomes.
Differential privacy is a mathematical technique that makes this process rigorous by measuring how much privacy increases when noise is added. The method is already used by Apple and Facebook to collect aggregate data without identifying particular users.
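The standard way to calibrate that noise is the Laplace mechanism. Below is a minimal sketch for a counting query; the Census Bureau’s production system is far more elaborate, so treat the parameter choices as illustrative.

```python
import math
import random

def private_count(true_count, epsilon, rng):
    """Release a count with Laplace noise calibrated to epsilon.

    A counting query changes by at most 1 when one person is added or
    removed (sensitivity 1), so Laplace noise with scale 1/epsilon gives
    epsilon-differential privacy. Smaller epsilon = more noise = more privacy.
    """
    # Sample Laplace(0, 1/epsilon) by inverse-CDF from a uniform in (-0.5, 0.5)
    u = rng.random() - 0.5
    noise = -(1 / epsilon) * math.copysign(math.log(1 - 2 * abs(u)), u)
    return true_count + noise

rng = random.Random(0)
# e.g. a hypothetical count of 30-year-olds in one census block
noisy = private_count(412, epsilon=0.5, rng=rng)
```

Individual releases are perturbed, but because the noise has zero mean, aggregate statistics over many queries stay close to the truth.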
Climate change attribution
Ten days after Tropical Storm Imelda began flooding neighborhoods across the Houston area last September, a rapid-response research team announced that climate change almost certainly played a role.
The group, World Weather Attribution, had compared high-resolution computer simulations of worlds where climate change did and didn’t occur. In the former, the world we live in, the severe storm was as much as 2.6 times more likely—and up to 28% more intense.
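That headline number is a risk ratio: the probability of an Imelda-scale event in simulations of the warmed world divided by its probability in the counterfactual world. The ensembles and threshold below are invented stand-ins, not World Weather Attribution’s models; they only show how the ratio is computed.

```python
import random

def exceedance_prob(rainfall_samples, threshold):
    """Fraction of simulated storms exceeding a rainfall threshold."""
    return sum(r > threshold for r in rainfall_samples) / len(rainfall_samples)

# Hypothetical ensembles: storm rainfall totals (inches) from a world
# with warming and a counterfactual world without it.
rng = random.Random(42)
no_warming   = [rng.gauss(20, 6) for _ in range(100_000)]
with_warming = [rng.gauss(23, 6) for _ in range(100_000)]

threshold = 35   # an extreme rainfall total (illustrative number)
risk_ratio = (exceedance_prob(with_warming, threshold)
              / exceedance_prob(no_warming, threshold))
# risk_ratio > 1 means warming made events this extreme more likely
```

Note how a small shift in the average produces a large change in the odds of the extreme tail, which is why attribution studies focus on probabilities rather than averages.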
Earlier this decade, scientists were reluctant to link any specific event to climate change. But many more extreme-weather attribution studies have been done in the last few years, and rapidly improving tools and techniques have made them more reliable and convincing.
This has been made possible by a combination of advances. For one, the lengthening record of detailed satellite data is helping us understand natural systems. Also, increased computing power means scientists can create higher-resolution simulations and conduct many more virtual experiments.
These and other improvements have allowed scientists to state with increasing statistical certainty that yes, global warming is often fueling more dangerous weather events.
By disentangling the role of climate change from other factors, the studies are telling us what kinds of risks we need to prepare for, including how much flooding to expect and how severe heatwaves will get as global warming becomes worse. If we choose to listen, they can help us understand how to rebuild our cities and infrastructure for a climate-changed world.