In the past, GitHub's CEO predicted that the future of coding is no coding at all. Researchers at Oak Ridge National Laboratory, Tennessee, have made a similar speculation, saying that machines will write most of their own code by 2040.
In a paper (PDF) titled "Will humans even write code in 2040 and what would that mean for extreme heterogeneity in computing?", the researchers state that current programming trends and research efforts will make Machine-Generated Code (MGC) as common in devices as artificial intelligence is today.
This isn't surprising, given the recent development of programs like Microsoft's DeepCoder, Google's AutoML, and DARPA's Probabilistic Programming for Advancing Machine Learning (PPAML). AutoML and DeepCoder already use machine learning to produce executable code.
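The idea behind such systems can be sketched with a toy example: given a few input-output pairs, search a small domain-specific language (DSL) for a program that reproduces them. The DSL, function names, and brute-force search below are invented purely for illustration; real systems like DeepCoder guide a similar search with a learned neural model rather than enumerating blindly.

```python
# Illustrative sketch only: a tiny enumerative program synthesizer.
# The DSL operations and names here are made up for demonstration.
from itertools import product

DSL = {
    "double":  lambda xs: [x * 2 for x in xs],
    "square":  lambda xs: [x * x for x in xs],
    "reverse": lambda xs: list(reversed(xs)),
    "sort":    lambda xs: sorted(xs),
}

def run(ops, xs):
    """Apply a pipeline of DSL operations, in order, to a list."""
    for op in ops:
        xs = DSL[op](xs)
    return xs

def synthesize(examples, max_len=2):
    """Return the shortest DSL pipeline consistent with all examples."""
    for length in range(1, max_len + 1):
        for ops in product(DSL, repeat=length):
            if all(run(ops, inp) == out for inp, out in examples):
                return ops  # first (shortest) matching program
    return None

examples = [([3, 1, 2], [2, 4, 6]),
            ([5, 4],    [8, 10])]
print(synthesize(examples))  # prints a matching pipeline of DSL ops
```

Even this crude search "writes code" from examples alone; the research question is how far learned models can push the same idea toward real programs.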
With tools like DOG4DOG, entire knowledge bases could be generated. Likewise, with code-generation technologies like the Eclipse Modeling Framework and Sirius, the entire data hierarchy, user interface, and middle layer could be produced.
It's no secret that the application programming interfaces in scientific libraries are also becoming more standardized. So, what's missing? The paper identifies a better machine understanding of the problem domain as the final piece of the MGC puzzle.
Other major requirements for MGC to become common practice would be more efficient languages for machine-to-machine communication and the allocation of hardware resources for writing code. The paper also describes a number of important research directions that could be pursued in the near future.
What are your views on this major upcoming shift driven by the evolution of Machine-Generated Code (MGC)? Share your opinions and join the discussion.