In Computer Science and its applications, a Programming Language is a language used to program (i.e. instruct) computers.
In the early days, computer engineers and a select group of programmers had to program in Machine Language (strings of zeroes and ones). This was due partly to the choice of the Binary Number System as the basis for designing the Arithmetic and Logic Unit inside the computer.
On the ICL 1902S computer, we often had to use the 24 keys to enter short pieces of Machine Code. That is history.
To bridge the gap between human users and computers, the next step was to use Assembly Languages such as
- Simple/Symbolic Assembly Language
- Macro Assembly Language.
A Macro Processor expands Macros (well-defined groups of Assembly Language instructions) into their constituent instructions.
An Assembler translates a program written in Assembly Language into Machine Language instructions.
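The translation an Assembler performs can be sketched in a few lines. The mnemonics, opcodes, and instruction format below are illustrative inventions for a hypothetical machine, not those of any real CPU:

```python
# A minimal sketch of an assembler for a hypothetical 8-bit machine.
# The mnemonics and opcode values are made up for illustration.
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011, "HALT": 0b1111}

def assemble(lines):
    """Translate lines of the form 'MNEMONIC operand' into 8-bit machine
    words: a 4-bit opcode in the high bits, a 4-bit operand in the low bits."""
    words = []
    for line in lines:
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        words.append((OPCODES[mnemonic] << 4) | operand)
    return words

program = ["LOAD 5", "ADD 3", "STORE 7", "HALT"]
print([f"{w:08b}" for w in assemble(program)])
# -> ['00010101', '00100011', '00110111', '11110000']
```

A real assembler also resolves symbolic labels into addresses, but the core idea is the same: a mechanical mapping from human-readable mnemonics to machine words.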
History of Programming Languages
The development of the first 11 (or so) programming languages is documented in the proceedings of the first HOPL (History of Programming Languages) Conference.
Currently, there are thousands of programming languages (some created for academic purposes), but only a limited number are used in production.
Evolution of Programming Style
Over the years, the style of programming has evolved. The list below is not exhaustive.
- Procedural programming (i.e. telling the computer system what to do, with emphasis on the “verbs”)
- Non-procedural programming (i.e. telling the computer system what one wants)
- Object-Oriented programming (i.e. emphasis on the “nouns”)
- Functional programming (i.e. based on “functions”)
- Logic programming (i.e. based on “Horn clauses” and similar logic systems)
- Top-down stepwise development
- Bottom-up & middle-out techniques
- Artificial Intelligence (AI) programming
For each paradigm, there are several programming languages, each with known advantages and limitations.
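The contrast between the procedural and functional styles above can be made concrete with one small task written both ways (the function names here are illustrative):

```python
# The same task in two styles: summing the squares of the even numbers.

# Procedural style: step-by-step instructions, emphasis on the "verbs".
def sum_even_squares_procedural(numbers):
    total = 0
    for n in numbers:
        if n % 2 == 0:
            total += n * n
    return total

# Functional style: the result is described as a composition of functions,
# with no mutable state.
def sum_even_squares_functional(numbers):
    return sum(n * n for n in numbers if n % 2 == 0)

nums = [1, 2, 3, 4, 5]
print(sum_even_squares_procedural(nums))  # -> 20
print(sum_even_squares_functional(nums))  # -> 20
```

Both produce the same answer; the difference is in how the computation is expressed, which is precisely what a programming paradigm captures.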
There is a theoretical model called the “Turing Machine”, which is primitive in design yet has the same computational power as modern computers.
The machine was proposed by Alan M. Turing, after whom the ACM Turing Award, often considered the “Nobel Prize in Computing”, is named.
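The model is simple enough that a working simulator fits in a few lines. The sketch below assumes a standard single-tape formulation; the state names and the example transition table (a toy machine that flips every bit on the tape) are illustrative:

```python
# A minimal single-tape Turing machine simulator.
def run_turing_machine(tape, transitions, state="start", blank="_"):
    """Run until the machine enters the 'halt' state; return the final tape."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        # Each transition: (state, read) -> (next_state, write, move)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# A toy machine: scan right, flipping 0 <-> 1, and halt on a blank cell.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_turing_machine("1011", flip))  # -> 0100
```

Despite this simplicity, the Church–Turing thesis holds that anything computable at all can be computed by such a machine, given enough tape and time.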
Alan Perlis, a pioneering Computer Scientist and programming language designer, described the “Turing Tar Pit”, where everything is possible [to compute] but nothing is easy.