The Golden Twenties began with the end of World War I and ended with the stock market crash of 1929. The decade is often remembered as the calm before the storm, but it was also a decade of sweeping technological change and innovation that transformed industrial productivity, life in general and, in turn, the world economy.
Electricity, the internal combustion engine and radio improved the general well-being of many and changed everyday life forever.

Still, a lot has changed in the hundred years since then. The transistor was invented over 70 years ago, the Internet has been mainstream for 25 years, and commodity cloud computing has been around for over 15 years, thanks to AWS.

Looking back at the 2010s, though, there have not been any earth-shattering technological inventions. Don't get me wrong: there have been numerous incredible improvements to existing technology in cost, speed and pervasiveness. But the focus has been more horizontal than vertical.

My prediction for the 20s is that we are going to see some major technological changes. Why? There are just too many forcing functions in the tech industry that can no longer be ignored. Corporations that sell user data or ignore data security are losing market share, and quickly.
FBI estimates put the growing cost of internet-enabled cybercrime at around $10 billion for the 2010s, and that number will only continue to rise. The rapid growth of IoT devices compounds the issue.


The world's computing devices are still largely based on the von Neumann architecture, which, while successful in getting us this far, has fundamental design issues affecting security that cannot be patched away.
A Microsoft study confirmed what many have long known: roughly 70% of all security vulnerabilities are due to memory safety issues.
Some of these risks can be mitigated by moving away from unsafe languages such as C and C++ (*Rust slowly raises its hand in the back of the class*) and by factoring in security as part of the system design process.
Intel has failed at cybersecurity for decades. SGX shows that they are not taking security seriously; it is still mostly about marketing and money, treating symptoms rather than the disease.
Open hardware designs such as RISC-V are enabling hardware-enforced, software-defined security that addresses these fundamental problems.
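To make the memory safety point above concrete, here is a minimal sketch (my own illustration, not from any study) of the difference between C-style unchecked array access and Rust's checked access:

```rust
fn main() {
    // A fixed-size buffer, as you might declare in C.
    let buf = [0u8; 4];

    // In C, reading buf[10] compiles and silently reads past the end of
    // the array: undefined behavior, and the root of the memory safety
    // bugs behind most vulnerabilities.
    // In Rust, a direct index like buf[10] panics at runtime, and the
    // checked accessor .get() returns None instead of out-of-bounds data:
    assert_eq!(buf.get(10), None);

    // In-bounds access works as expected.
    assert_eq!(buf.get(2), Some(&0u8));

    println!("out-of-bounds read rejected safely");
}
```

The point is not that Rust is magic; it is that the language's defaults turn silent memory corruption into a visible, recoverable error.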
 
These challenges will very likely force a long-overdue shift in the tech industry away from prioritizing hardware and software performance at the expense of security and safety. Companies that don't make this shift will at best lose major market share and at worst quickly go out of business. The stakes around security are simply too high now: nation states, hacktivists, terrorist groups and criminal syndicates have more leverage than ever before over systems and companies that don't take responsibility for securing the physical, digital and financial assets tied to their technology.

Without addressing security in a fundamental way, there will be no Artificial General Intelligence era, no self-driving cars, no fintech revolution. Left unaddressed, security could trigger a backlash against technology in the 2020s, much as the 1920s saw a resurgence of the Luddites (especially with an economic event like 1929).

It's time for the tech industry to take responsibility for what it creates. Tech-bros and corporate marketing brushing aside those who point out the risks as "technophobes" could prove a far more dangerous proposition in a near-future society. We aren't headed into a tech utopia in the 20s, but we are headed into a golden age if we face the challenges ahead of us.

Fritz Lang's 1927 Metropolis is probably one of the earliest and greatest science-fiction films ever made. It came with a warning.