> GPT3 writing code. A compiler from natural language to code.
>
> People don't understand — this will change absolutely everything. We're decoupling human horsepower from code production. The intellectual equivalent of the discovery of the engine.
>
> — Flo Crivello (@Altimor), July 2, 2020
So I'm posting this thing early, because why not. But rather than a prank, it's more a prediction of the future.
Inference was created to define a new paradigm of programming language.

Where all previous languages required a rigid syntax, with precise semantics and no leeway, Inference uses the power of neural networks to infer the programmer's meaning.
The basic model works like this:
The programmer writes code however they want, in any of these styles:
- var x = position + 8
- add a margin of 8 to the position - keep track of that for me (call it x)
- let a new variable (let's say x) hold that variable I just used, plus the fixed margin
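Under this model, all three phrasings above compile to the same statement. Here is a minimal sketch of that idea in Python; since Inference's internals haven't been released, a lookup table over the example phrasings stands in for the trained network, and all names are illustrative:

```python
# The single canonical statement all three phrasings should map to.
CANONICAL = "var x = position + 8"

# Hypothetical stand-in for the neural model: in the real system a
# trained network would infer this mapping rather than look it up.
PHRASINGS = {
    "var x = position + 8": CANONICAL,
    "add a margin of 8 to the position - keep track of that for me (call it x)": CANONICAL,
    "let a new variable (let's say x) hold that variable I just used, plus the fixed margin": CANONICAL,
}

def infer(source_line: str) -> str:
    """Return the canonical code inferred for a phrasing."""
    return PHRASINGS[source_line.strip()]
```

Whatever the phrasing, `infer(...)` returns `var x = position + 8`, which is the whole point: the surface syntax is free, the compiled meaning is fixed.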
Training
Regarding *let a new variable (let's say x) hold that variable I just used, plus the fixed margin*, did you mean:
a) int x = position + fixedMargin [92% certainty]
b) int x = position + [?? supply reference to "fixed margin"] [98% certainty if reference supplied]
c) int x = position + top [86% certainty as fixedMargin used after top]
d) [Let me know more precisely what you meant]

As this is a trained system, the more consistently the style is written, the easier it is to train. That might make it seem pointless, except that it stores multiple profiles, one per developer, meaning multiple programmers can each use their own style and train the model to understand what each of them means.
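The disambiguation loop above can be sketched as a small data structure: candidate interpretations with certainty scores, ranked within a per-developer profile, where each confirmed choice becomes further training data. This is a hypothetical sketch of the workflow, not Inference's actual API; every name here is invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Candidate:
    code: str
    certainty: float  # model-reported confidence, 0.0 to 1.0

@dataclass
class DeveloperProfile:
    """One trained style profile per programmer (hypothetical structure)."""
    name: str
    history: list = field(default_factory=list)  # (phrasing, chosen code) pairs

    def rank(self, candidates):
        # The trained model scores candidates; here we just order them by
        # the certainty values it reports, highest first.
        return sorted(candidates, key=lambda c: c.certainty, reverse=True)

    def confirm(self, phrasing, chosen_code):
        # Each confirmed choice is further training data for this profile.
        self.history.append((phrasing, chosen_code))

# The options from the dialogue above, with their stated certainties.
profile = DeveloperProfile("dev1")
options = [
    Candidate("int x = position + fixedMargin", 0.92),
    Candidate("int x = position + top", 0.86),
]
best = profile.rank(options)[0]
profile.confirm("let a new variable (let's say x) hold ...", best.code)
```

Because profiles are separate objects, two developers can give the same phrasing different confirmed meanings without interfering with each other, which is what makes the per-developer styles workable.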
Performance
So far, this is in alpha, but with just 3 days of training on pseudocode written by 3 developers, it has been able to understand, compile and execute a simple Pong-style game, written in plain language by those same developers.

More info
Code and infrastructure config will be released soon. Tweet at me for more info.