Buffy Posted November 8, 2007

> You know it's funny really, because there are some purists around today who would argue that the only truly compiled programs around these days are DOS 3GLs (where all necessary elements are compiled into the one .exe from their respective libraries).

If that were the definition, then everything that has ever existed on Unix (since, oh, say, System III or BSD 2.1) is "interpreted" merely because it uses shared libraries! I guess there are folks who believe this, but they are what most of the rest of us would call amateurs...

> And even Wiki stated that the higher level GL's can be compiled into a 3GL language (the original C was written in C).

As I said, avoid Wiki's programming information because most of it is worthless, but translation into C was the most common approach for 4GLs in the 70s-90s, although there were many that were interpreted too.

> The difference between an interpreted program and a compiled program can be discovered by reading the unencrypted .exe file (produced by a compiler or MS VS product) in a word processor. If you see the screen forms and can read the code then you are definitely looking at an interpreted .exe not a compiled one (as in MS VB and MS VFP).

This "analysis" can be highly misleading: "screen forms" are mostly the text that is displayed on the screen, and can appear to be "source code," but it's nothing of the sort: it's the text strings, almost always stored in the same order that they appear on the form. If you are familiar with CLR or even MFC, you'll know that forms are *entirely* code. There are "resources" that describe the text elements and positions, but these are simply fully compiled data structures: no "interpretation of code" is going on at all. In addition, with shared libraries there is textual linking information that allows named libraries to be found via the registry, and there is plenty of text associated with error messages, filenames, paths, and the identifier tables involved in linkage, and sometimes debugger information, all of which can "look like" source code but again is not.

for (;;) puts("Honest, it's compiled!\n");
Buffy
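[Buffy's point is easy to check for yourself. Below is a minimal sketch in plain standard C of the idea behind the Unix `strings` utility: scan any file for runs of printable bytes. Point it at a fully compiled executable and you will still see plenty of readable text (literals, import names, paths), so "I can read text in the .exe" proves nothing about interpretation. The default filename "a.exe" is just a placeholder.]

    /* strings_lite.c -- scan a file (e.g. a compiled .exe) for runs
       of printable ASCII, the way Unix `strings` does. */
    #include <stdio.h>
    #include <ctype.h>

    #define MIN_RUN 4   /* report runs of at least this many printable bytes */

    int main(int argc, char **argv)
    {
        FILE *f = fopen(argc > 1 ? argv[1] : "a.exe", "rb");
        if (!f) { perror("fopen"); return 1; }

        char buf[256];
        int n = 0, c;
        while ((c = fgetc(f)) != EOF) {
            if (isprint(c) && n < (int)sizeof buf - 1) {
                buf[n++] = (char)c;                   /* extend the current run */
            } else {
                if (n >= MIN_RUN) { buf[n] = '\0'; puts(buf); }
                n = 0;                                /* run broken: reset */
            }
        }
        if (n >= MIN_RUN) { buf[n] = '\0'; puts(buf); }
        fclose(f);
        return 0;
    }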
LaurieAG Posted November 8, 2007

> Was there something in my previous post that was unclear?

Hi Buffy,

When I completed my Applied Science Computing degree in 1993:

A 1GL was Machine Code (ones and zeroes)
A 2GL was Assembly Language
A 3GL was a compiled language
A 4GL was an interpreted program

Considering that the Dean of my college actually wrote the assembly language simulator (in Pascal) for, and taught, the microprocessors course, among others, can you explain to me where he went wrong, apart from 'marketing' reasons?

It's quite reasonable to consider a (non-compiled) Java program as a 5GL because it is a universally interpreted program that is not tied to any particular operating system; all of the others are tied to an OS. If you need portability over many OSes and don't require speed, you would choose Java.
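[For a concrete picture of the 1GL/2GL/3GL layering, here is one and the same function at each level. The assembly and machine-code bytes in the comment are representative x86-64 output for this function at -O2; actual output varies by compiler, flags, and target, so treat them as illustrative rather than canonical.]

    /* 3GL (C): what the programmer writes. */
    int add(int a, int b)
    {
        return a + b;
    }

    /* 2GL (assembly): representative x86-64 compiler output,
       shown in Intel syntax (arguments arrive in edi/esi; the
       lea uses the full registers for the address arithmetic):

           add:
               lea  eax, [rdi+rsi]   ; eax = a + b
               ret

       1GL (machine code): the same two instructions as raw bytes:

           8d 04 37                  ; lea eax, [rdi+rsi]
           c3                        ; ret
    */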
Buffy Posted November 8, 2007

> When I completed my Applied Science Computing degree in 1993:
> A 1GL was Machine Code (ones and zeroes)
> A 2GL was Assembly Language
> A 3GL was a compiled language
> A 4GL was an interpreted program

There is broad agreement only about 1GL-3GL. I'm too lazy to look them up right now, but actually the very first 4GLs (many built for IBM 3X0s) all generated either C, COBOL or PL/1, which was then compiled to executable code. Interpreters came later, because it was all about "rapid application development," and waiting around for the "translate-compile-link-load-run" cycle was painful. But a translator to C is actually easier to write than a full-blown interpreter, so that was the first choice (70s and 80s).

James Martin first popularized the term, but gave us the rather unhelpful definition at the time of "a language that is ten times more productive than a 3GL." I've met James Martin; he's a very smart guy, but I was never convinced he really had any of this worked out very well, because he always came back to that definition.

The other major trait that most people point to is "non-procedurality" or "event-driven" structure. This is really due, though, to the fact that building window-based applications is inherently non-procedural, and it becomes a more natural way to build such interactive programs--and a more efficient one, since you don't have to maintain your own event loop and interrupt handling explicitly in your application. Non-procedural programming originated, though, with much earlier report-generation "languages" that were "specification-oriented." There's no notion of an "event," but the "sequence" of the "application" is implied by the layout of the page and can be positionally reoriented in an automated fashion. In either case, once an "event" is defined, you still end up with procedural code as the definition of the method in question, so arguments about how many fewer "function points" are required to do something specific become very much akin to counting the number of angels that dance on the head of a pin...

With James Martin's wide popularization of the term, in conjunction with his vague definitions, we marketing folks kind of went wild with applications of the term, coming up with things to differentiate "our" implementation from "the stupid program the competition had":

"Ours is a 4GE: a complete development ENVIRONMENT that lets you interactively paint forms/windows!"
"Theirs takes forever to use: you constantly have to wait for it to recompile!"
"Theirs is slow: it's interpreted!"
"Ours is interpreted when you build and compiled when you deploy!"
"Ours has more events! There's less code to write!"
"Ours is integrated with [your favorite] CASE tool and generates code from data flow diagrams!"
"Theirs is just a 4GL; ours is FIVE-GL!"

And so on and so forth ad nauseam, resulting in Scott Adams's at least partially justified jaundiced view of marketing.

> It's quite reasonable to consider a (non-compiled) Java program as a 5GL because it is a universally interpreted program that is not tied to any particular operating system; all of the others are tied to an OS.

That just says the virtual machine is ported to many systems, which is a deployment issue, not one having to do with ease of programming (no different from a programming viewpoint than having native compilers on all systems).
In addition to there being no even modestly accepted definition of 5GL, there's no difference between Java--which is really just a very strongly typed C++--and a bunch of other languages that are generally accepted to be 3GLs. The differences that really matter in development today have to do with the libraries and resources that go along with specific "platforms," by which most of us mean "window systems" (if you're into *nix; "Windows" and "Mac" operating systems if you're not). But this has nothing to do with the "generation" of the "language," and most folks who work on languages and compilers today just plain ignore the "generation" issue. There are many more interesting discussions you can have about the benefits of various languages, and I personally find this particular "competition" kind of silly, since the two are so darn similar. Now if you want *interesting*, get me going on Lisp or Smalltalk. Or ask CraigD about MUMPS! :)

When I read commentary about suggestions for where C should go, I often think back and give thanks that it wasn't developed under the advice of a worldwide crowd,
Buffy
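[To make Buffy's "the first 4GLs generated C" point concrete, here is a sketch of what a translate-to-C 4GL generator might emit. The 4GL syntax and the rt_* cursor calls are invented for illustration--each real product had its own runtime library--and the runtime is stubbed with an in-memory table so the sketch actually compiles and runs.]

    /* Hypothetical 4GL source:
           FOR EACH customer WHERE balance > 0
               PRINT name, balance
           END
       Sketch of the C a translate-to-C generator might emit. */
    #include <stdio.h>

    typedef struct { const char *name; double balance; } Row;

    /* ---- stub runtime (stands in for the vendor's database library) ---- */
    static Row table[] = { {"Acme", 120.0}, {"Bolt", -15.0}, {"Cox", 42.5} };
    static int cursor;

    static void rt_open_cursor(void)  { cursor = 0; }
    static Row *rt_fetch(void)
    {
        int rows = (int)(sizeof table / sizeof table[0]);
        return cursor < rows ? &table[cursor++] : NULL;
    }
    static void rt_close_cursor(void) { }

    /* ---- what the generator would emit for the FOR EACH block ---- */
    int main(void)
    {
        Row *r;
        rt_open_cursor();
        while ((r = rt_fetch()) != NULL) {     /* FOR EACH customer  */
            if (!(r->balance > 0)) continue;   /* WHERE balance > 0  */
            printf("%s %.2f\n",                /* PRINT name,balance */
                   r->name, r->balance);
        }
        rt_close_cursor();
        return 0;
    }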
LaurieAG Posted November 9, 2007

> The other major trait that most people point to is "non-procedurality" or "event-driven" structure. This is really due, though, to the fact that building window-based applications is inherently non-procedural, and it becomes a more natural way to build such interactive programs--and a more efficient one, since you don't have to maintain your own event loop and interrupt handling explicitly in your application.

Hi Buffy,

It's true that all the GLs up to 4 were mainly DOS-based programs. While Clipper provided you with the code to generate your own standalone DBU.exe (database utility), dBase provided them as part of the development environment (but didn't provide a compiler). In this respect Clipper was a 3GL and dBase was a 4GL.

When OOP (Object-Oriented Programming) came into being, with its class structures and object inheritance, you could see how this was different from all of the others, especially if you could see the event loop in action. Some of the early 16-bit Windows development environments would show you exactly how an event loop operated in procedural code, without having classes and objects as such. And some of these 'dinky' 16-bit programs could operate 1,000 percent faster than their class/object relatives, mainly because they used cut-down models (that don't include the kitchen sink) with a very clean and efficient event loop (and a minute executable and runtime).

When you want to create a fast and efficient game engine you must create your own from scratch (unless you license someone else's, e.g. the Duke Nukem 3D engine, and remodel it--not that much different from modding), with the attributes you need and without everything else. While C++ may be faster than Java, the actual speed optimisation you can get out of C++ depends on how many of the 'default' classes can be reduced so that they only provide what you explicitly want in your program.

So, coming around full circle, the speed of your application depends on how tight your .DLLs (Dynamic Link Libraries) are in relation to what your application actually requires. In the case of a true compiler (and an optimised class-based application), only those elements of libraries that are required for the execution of the compiled program are included in the executable file (in the case of the optimised application, custom classes), and all other unnecessary elements are ignored.
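[The "event loop in plain procedural code" those 16-bit environments exposed is easy to sketch. This toy version uses no real window system--events come from a hard-coded queue so it runs anywhere--but it has the same fetch-and-dispatch shape as the classic Windows pump shown in the leading comment.]

    /* Toy event loop in plain procedural C. The classic Windows pump
       it mimics looked like:
           while (GetMessage(&msg, NULL, 0, 0)) {
               TranslateMessage(&msg);
               DispatchMessage(&msg);
           }                                                          */
    #include <stdio.h>

    typedef enum { EV_KEY, EV_CLICK, EV_QUIT } EventType;
    typedef struct { EventType type; int data; } Event;

    static Event queue[] = { {EV_KEY, 'a'}, {EV_CLICK, 7}, {EV_QUIT, 0} };
    static int head;

    static int get_event(Event *e)        /* fetch next event; 0 = done */
    {
        if (head >= (int)(sizeof queue / sizeof queue[0])) return 0;
        *e = queue[head++];
        return e->type != EV_QUIT;
    }

    static void on_key(int ch)  { printf("key   %c\n", ch); }
    static void on_click(int x) { printf("click %d\n", x); }

    int main(void)
    {
        Event e;
        while (get_event(&e)) {            /* the loop itself...        */
            switch (e.type) {              /* ...dispatching to handler */
            case EV_KEY:   on_key(e.data);   break;  /* functions whose */
            case EV_CLICK: on_click(e.data); break;  /* bodies are plain*/
            default:       break;                    /* procedural code */
            }
        }
        return 0;
    }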
Qfwfq Posted November 12, 2007

> Do you have anything further to contribute to this discussion Q or do you have some other sort of agenda?

I'm certainly not interested in a pissing contest, thanks, and it's no use when you simply don't get your facts straight. Don't suggest that I'm the one who's been dragging the thread off the point.
Buffy Posted November 12, 2007

> It's true that all the GLs up to 4 were mainly DOS-based programs. While Clipper provided you with the code to generate your own standalone DBU.exe (database utility), dBase provided them as part of the development environment (but didn't provide a compiler). In this respect Clipper was a 3GL and dBase was a 4GL.

Um. DOS? Most of the history I'm referring to here is much bigger than the "IBM PC clone"... most of the interesting stuff was done on MVS/CICS, System V and BSD-based Unixes (are you old enough to remember Pyramid, CCI and Sequent?), VMS, or AOS. If you want to compare, take a look at some of the programming environments ("4GL" hadn't been coined yet) that Cullinet, CCA and Cincom built for their own proprietary network database packages (or that worked on IMS): neither dBase nor Clipper could hold a candle to these things, which predated them by nearly a decade! Then, contemporary with some of the early DOS tools, were all the relational 4GLs from Oracle, Ingres, Informix, Sybase, PowerBuilder, etc., the vast majority of which started as tools to build forms-based applications for dumb terminals but migrated to window/GUI support with much the same infrastructure and language elements.

> When OOP (Object-Oriented Programming) came into being, with its class structures and object inheritance, you could see how this was different from all of the others, especially if you could see the event loop in action.

Manually built event loops are boring and have nothing at all to do with whether the language is object-oriented or not. The Microsoft Foundation Classes (MFC) that grew up under versions 3-8 of MSC++ were obviously OO, but you had to manage the event loop yourself (something the CLR does much more cleanly). A built-in event loop was de rigueur under most of the above-mentioned 4GLs, as a key tool for reducing the amount of code, but few were "extensible" as is now common: you just had the events the system gave you. It's interesting to note, too, that this event-driven orientation went back even to the mainframe 4GLs--which were definitely not OO--precisely because block-mode terminals under CICS were "event-driven" by definition! You could get a lot more interactive with the serial terminals, because keypresses caused interrupts that could be caught and would trigger events in the 4GL if it supported them (Oracle and Ingres both did that).

> Some of the early 16-bit Windows development environments would show you exactly how an event loop operated in procedural code, without having classes and objects as such. And some of these 'dinky' 16-bit programs could operate 1,000 percent faster than their class/object relatives, mainly because they used cut-down models (that don't include the kitchen sink) with a very clean and efficient event loop (and a minute executable and runtime).

Without a reference to which specific tools you're talking about here, it's hard to respond, but it should be noted that Clipper had a p-code mode, while dBase was straight interpretation of the text code, and that's why Clipper was so much faster than dBase. Both--using the definitions from earlier posts, though--were interpreted and not compiled. On an 8088, everything is slow! :)
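[The Clipper-vs-dBase speed gap Buffy describes comes down to dispatch cost. The toy contrast below uses an invented two-opcode "p-code"--not Clipper's actual format--to show why executing pre-digested instruction codes beats re-scanning source text every time a line runs.]

    /* Toy contrast: p-code execution vs. text interpretation. */
    #include <stdio.h>
    #include <string.h>

    enum { OP_PRINT, OP_HALT };          /* pre-digested instruction codes */

    static void run_pcode(const int *code)
    {
        for (int pc = 0; ; pc += 2) {    /* one switch per instruction...  */
            switch (code[pc]) {
            case OP_PRINT: printf("%d\n", code[pc + 1]); break;
            case OP_HALT:  return;
            }
        }
    }

    static void run_text(const char *const *lines)
    {
        for (int i = 0; lines[i]; i++) { /* ...versus re-parsing the text  */
            int n;                       /* of every line, every time it   */
            if (sscanf(lines[i], "PRINT %d", &n) == 1)  /* is executed     */
                printf("%d\n", n);
            else if (strcmp(lines[i], "HALT") == 0)
                return;
        }
    }

    int main(void)
    {
        const int pcode[] = { OP_PRINT, 42, OP_HALT, 0 };
        const char *text[] = { "PRINT 42", "HALT", NULL };
        run_pcode(pcode);   /* Clipper-style: translate once, run codes */
        run_text(text);     /* dBase-style: scan the source every run   */
        return 0;
    }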
> While C++ may be faster than Java, the actual speed optimisation you can get out of C++ depends on how many of the 'default' classes can be reduced so that they only provide what you explicitly want in your program.

This is incorrect: while fine-grained/custom linking *will* speed up load times if you are building a monolithic executable, it will have no noticeable effect on runtime speed. In the really old days, you'd be cursed with limited memory and swapping from page files (are you old enough to remember LIM and manual overlay support?), but today that rarely happens: you're much more likely to force *data* into page files than code. So,

> the speed of your application depends on how tight your .DLLs (Dynamic Link Libraries) are in relation to what your application actually requires. In the case of a true compiler (and an optimised class-based application), only those elements of libraries that are required for the execution of the compiled program are included in the executable file (in the case of the optimised application, custom classes), and all other unnecessary elements are ignored.

The DLLs get loaded and are rarely swapped out, and if they are, it's the *unused pages* of the libraries that go; that is, it's the operating system that can intelligently figure out what's not being used! The downside of custom DLLs is that the linkage process is nightmarish: you need to know what all the dependencies are, and getting something to link properly takes forever. MS used to do it, and so did all the standard Unix libraries, but it simply is not worth it anymore.

Just to reiterate: having large libraries does not slow down execution. The compiled application knows exactly what the offsets are to functions within the libraries, and they're built in at compilation, to the point that under MSC++ (and most other C++ implementations as well) you need to relink when you recompile, whereas standard Java *compilers* let you do dynamic linking (offsets are looked up at runtime, which is, again, a startup cost, since the offsets are then cached).

Hope this clarifies the technology a bit more...

Sometimes I get my verb tenses mixed up, :)
Buffy
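[The runtime lookup Buffy describes--paid once, then cached--is easy to see with explicit dynamic loading. This sketch is Windows-only and uses the real Win32 calls LoadLibraryA/GetProcAddress/FreeLibrary; resolving MessageBoxA by name at runtime is exactly the step a statically resolved import lets you skip.]

    /* Runtime symbol lookup: ask the loader for a DLL, then for an
       address inside it. Windows-only. */
    #include <windows.h>

    typedef int (WINAPI *MsgBoxFn)(HWND, LPCSTR, LPCSTR, UINT);

    int main(void)
    {
        HMODULE h = LoadLibraryA("user32.dll");   /* map the DLL (or reuse it) */
        if (!h) return 1;

        /* look up the function's address by name -- the startup cost
           that gets cached for the rest of the run */
        MsgBoxFn mb = (MsgBoxFn)GetProcAddress(h, "MessageBoxA");
        if (mb)
            mb(NULL, "Resolved at runtime", "Dynamic linking", MB_OK);

        FreeLibrary(h);
        return 0;
    }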