Throughout my studies and professional career I have used a large number of programming languages, as is probably the case for most people who have had a long technical career in software development.
I started with Basic in 1990, then moved on to Pascal and C. When I arrived at university it was Modula-2 (a Pascal dialect, or maybe it's the other way round), C often became C++, and there were other languages like Eiffel (which we had to use with a compiler limited to 100 classes). On top of that there were the mathematical languages Matlab, Mathematica and Maple, plus Fortran, which I used in that context, and some others related to circuit design that I can't even remember now despite the many hours spent on them. In Artificial Intelligence courses I used Clips; as far as I remember it was introduced as a variant of Lisp, maybe because of the parentheses, maybe for deeper reasons.
When I started working, most of those languages evolved (or I found their evolved versions) and got a "Visual" in front of them: Visual Basic, Visual C++, and even Pascal got a "Visual" version, named Delphi. I also worked for a while in Java and Python.
This is still a very narrow view of the technologies that were available and popular back in 2007. Trying to keep track of all the programming languages was hopeless. I was not the only one, nor the first, to reach that conclusion: the United States Department of Defense had reached it in the early seventies. Here you can find a more comprehensive overview of the existing programming languages.
Internally, the DoD was using more than 100 different programming languages across a number of different systems. To address this issue (yes, it is an issue when an organization has to deal with that), they developed a new programming language, Ada, intended for universal use within the organization. I haven't read, though, about its rate of adoption inside the organization, and it certainly has not become very popular in general.
Yet the solution to having too many programming languages was to define a new one, not to evolve one of the existing ones. This is probably just a matter of "how to name things", since the new language was surely influenced by the existing ones, but it still matters: it hints that authorship and collaboration are part of the underlying problem. Certainly, I doubt the reasons were as devilish as the ones attributed to Bjarne Stroustrup for designing C++, a hoax that has been around for a while; you can read an article about it here.
As technology has become accessible to more people, the number of programming languages has continued to grow, and my perception is that the pace is accelerating too. In a few cases, languages contribute new paradigms to address new scenarios; in other cases, my understanding is that the main claim is speed of development for the most common tasks in a certain context.
If you are still not convinced of the magnitude of the issue, here you can find a more visual and compelling representation, up to 2004. And no, it is not obsolete now; it was already obsolete the moment it was published!
Good then: software developers have plenty of tools to choose from. This should all be good, right? Well, it has some advantages, true... but to me it is more of a "divide and dilute" approach. Many of these programming languages, the ones that reach some popularity, do so because they manage to gain the support of a community that adopts the language and starts contributing new developments to it (tools, libraries, tutorials, ...). But as in many other fields, there is a long tail: the languages that never gain traction. A huge effort is put in whose return is small, except for the people directly involved, who have learned a lot in the process. And the drawbacks are significant, because every new programming language means starting from zero to create the whole ecosystem required to be productive with it.
Who knows where computing would be if the open source initiatives (I leave the commercial efforts aside) collaborated more often, and did so from the very start, whenever a new need is detected. Obviously, the community becomes very efficient once a programming language has gained some visibility, and there is a kind of snowball effect that helps bring a lot of collaboration together. But that efficiency does not happen, or is not sought, while the programming language is being defined.
Wouldn't it be great if there were an organization that helped unify efforts and avoid "duplicates" whenever someone decides to design a new programming language, or considers there is a need for one?
Until that happens, and while developers need to deliver in record time to cope with tyrannical deadlines and growing demands for productivity, they have to find ways to get the job done. And quite often the way to do it is to put to good use sites where knowledge is shared (like stackoverflow.com), where libraries and tools are shared (like sourceforge.net), and many others. The use of this kind of website to help develop software applications suggests to me a concept I have named "online programming" (but I'd better leave that for another post).
Why not, then, use these same collaborative platforms to aggregate energy, knowledge and initiative into larger projects? It is true that there are many things to agree on in projects this large, and that the effort of building the required consensus may be significant, but if it succeeds the result could be very promising: a programming language created by the community. Can you think of a more ambitious computing project? The Esperanto of programming languages.