In 1822, Charles Babbage invented the Difference Engine, an automatic mechanical calculator designed to tabulate polynomial functions. Ever since then, computers have needed instructions to perform specific tasks. This means of providing instructions to a computer in order to accomplish a task is known as a programming language.
Initially, programming languages were composed of a sequence of steps to create a program. These evolved into series of steps keyed into machines and later executed. With time and advances in information technology, these languages gained sophisticated features such as logical branching and object orientation.
Babbage's Difference Engine could only perform tasks by changing gears to execute calculations, which means the earliest form of programming was dependent on physical motion. Physical motion was later replaced by electrical signals when the US Government built ENIAC in the mid-1940s. ENIAC followed many of the same principles as Babbage's machine, and programming it by pre-setting switches and rewiring the entire system for each new problem proved tedious and time-consuming.
The difficulty of programming ENIAC was overcome by two concepts John von Neumann developed while working at the Institute for Advanced Study. The first, the "shared program technique", stated that computer hardware should be simple and should not need to be rewired for every new program; instead, complex instructions should control the simple hardware, allowing it to be reprogrammed much faster.
The second, "conditional control transfer", gave birth to the concept of subroutines: blocks of code that could be jumped to in any order, rather than a single chronologically ordered list of steps for the machine to follow. This idea introduced IF ... THEN expressions and looping with FOR statements, along with the concept of 'libraries' as a medium for reusing code blocks. Separately, the German engineer Konrad Zuse (http://www.epemag.com/zuse/) independently developed many similar notions in his computer systems and his Plankalkul programming language, though it took years for his work to reach the public. In 1949 the language Short Code (www.byte.com) appeared. It was the first programming language for electronic devices, and it required the programmer to translate its statements into binary codes, i.e. 0s and 1s, by hand. Still, it was the first step towards the complex languages of today.
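The ideas attributed to "conditional control transfer" above can be illustrated with a minimal modern sketch (in Python, not period code): a reusable subroutine, a conditional branch, and a loop, instead of one fixed sequence of steps. The function name and values here are purely illustrative.

```python
def classify(n):
    """A subroutine: a reusable code block that can be jumped into."""
    if n % 2 == 0:          # conditional branch (the IF ... THEN idea)
        return "even"
    return "odd"

labels = []
for n in range(4):          # looping (the FOR idea)
    labels.append(classify(n))

print(labels)               # ['even', 'odd', 'even', 'odd']
```

Collecting such subroutines into a shared module is, in essence, what the early notion of a 'library' became.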
FORTRAN, as its name (FORmula TRANslation) signifies, appeared in 1957 as the first major programming language. It was designed by IBM for scientific computing. FORTRAN was good at handling numbers but poor at handling input and output, so COBOL was introduced. In addition to working with numbers, COBOL could work with strings, which could be grouped into arrays and records so that data could be tracked better.
In 1958, John McCarthy of MIT invented the LISt Processing (LISP) language, designed for Artificial Intelligence (AI) research. The only basic data type in the language was the list; new data types were added in the mid-1960s.
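To give a feel for that list-only data model, here is a hedged Python analogue (not actual LISP) of the recursive list processing that was LISP's core style; the `flatten` function is a hypothetical example, not from the original text.

```python
def flatten(x):
    """Recursively walk a nested list, LISP-style: a list's elements
    may themselves be lists, so processing is naturally recursive."""
    if not isinstance(x, list):
        return [x]              # an atom: wrap it so callers can extend
    result = []
    for item in x:
        result.extend(flatten(item))
    return result

print(flatten([1, [2, [3, 4]], 5]))  # [1, 2, 3, 4, 5]
```

In LISP itself, both programs and data share this nested-list form, which is part of what made it so well suited to AI research on symbolic computation.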
The Algol language was created in 1958 by a committee for scientific use. Notably, it was the first language with a formal grammar, and it became the root of languages such as Pascal, C, C++, Java, Visual Basic and Perl. Many later programming concepts, such as object-oriented programming and design patterns, built on ideas introduced after Algol. The most widely used programming languages today are, in effect, grandchildren of these ancestors. Programming languages have been evolving since Charles Babbage's early invention and will continue to do so in the years to come, becoming ever more flexible, robust and optimized so that people can more easily accomplish the tasks they want in fields like health, biology, physics, artificial intelligence, aerospace and much more.