stereologist Posted May 27, 2009

It was fairly clear to me that the problem was a "counting issue." Counting issues often lead to claims like the one attributed to Disraeli: "Lies, damned lies, and statistics". For instance, the literature on the number of neurons in our brains has claimed anywhere from 500 million up to 500 billion. The broad range was due to counting issues. In counting, step 1 is to decide what to count, and step 2 is to decide how to count it.

Maddog points well to this issue by describing how counting steps does not reflect cost. One solution is to count each step in a weighted manner. Another solution is to consider all solutions with a constant ratio to each other as equal. By this I mean a solution that takes 3n steps is the same as a solution involving 10n steps, but a solution requiring log(n) steps is not equivalent.
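To make the counting point concrete, here is a minimal sketch (Python is assumed; the function names are invented for illustration) that counts comparisons for a linear scan versus a halving search. The two linear variants differ only by a constant factor, while the halving count grows as log(n), which is why it lands in a different class.

```python
# A minimal sketch (Python assumed; function names are made up for illustration)
# of why counts that differ by a constant factor are treated as equivalent,
# while a log(n) count is not.

def linear_count(n):
    """Comparisons made by a linear scan over n items: proportional to n."""
    steps = 0
    for _ in range(n):
        steps += 1          # one comparison per item; 3n vs 10n only changes the constant
    return steps

def binary_count(n):
    """Comparisons made by a binary search over n sorted items: proportional to log2(n)."""
    steps, span = 0, n
    while span > 1:
        span //= 2          # each comparison halves the remaining range
        steps += 1
    return steps

for n in (1_000, 1_000_000):
    print(n, linear_count(n), binary_count(n))
# Doubling n doubles the linear count but adds only one step to the binary count,
# which is why the two fall in different classes regardless of constant factors.
```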
CraigD Posted May 27, 2009

…The word transducer has a meaning which is not archaic. It is a term describing some of the mathematical machines that can be used to describe computations.

Or, in the immortal words of Richard O'Brien's Planet, Schmanet, Janet:

Tha' transducer
Will seduce 'ya
It's something you'll get used to

Language is mutable. Here, technically, it should have been "transmuter", but that wouldn't rhyme with "seduce ya". Even in computer science (which these lyrics are unambiguously not), one shouldn't discount the importance of ring and rhyme.
Buffy Posted May 27, 2009

Let's define a sub-collection of the collection of all programs, algorithms, as programs that usually end, and produce useful data.

Cool, but I'll still argue this is insufficient, since it still allows for my stop-light "algorithm" posted earlier:

It's indubitably a program based on endless extant implementations.
It produces "useful data" insofar as it prevents crashes between moving hunks of steel.
It stops occasionally when the power goes out, or the highway patrol is handling a special event, or it's taken out for service, but even then, the restriction is limited to "usually"...

So what are we gonna do about that "usually"? Is it a requirement or not as we define our "consensus terms?"

I have opinions of my own -- strong opinions -- but I don't always agree with them, :phones:
Buffy
stereologist Posted May 27, 2009

Buffy, I would argue no, because your point 3 is that the finite nature of the program is dependent on events that are not part of the program. I would also propose that your point 2 is weak, since the usefulness is not part of the output of the program, but rather a secondary consequence of it. The actual output of the program is signaling in the form of lights of different colors.

Your program would be similar to generating 30 Rs, then 4 Ys, then 30 Gs, and then repeating this process endlessly. A Turing machine could be made to perform this action (although not behaving as a transducer). The output of the Turing machine never ends. It isn't an algorithm. It is a program. The generation of one cycle, 30 Rs, 4 Ys and 30 Gs, can be an algorithm; a sketch of the distinction follows below.

I mean, the main reason for these definitions is to be able to evaluate a computation. Is it correct? Is it as fast as it is possible to compute the value? Is there a way that uses less memory?
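As a hedged illustration of that distinction (Python assumed; the names are made up for this example), here is a sketch in which one cycle of 30 R, 4 Y, 30 G terminates and so can be treated as an algorithm, while the endless repetition is only a program:

```python
# A minimal sketch of the distinction (Python assumed; names are illustrative only).
# one_cycle terminates and returns its full output, so it fits the "algorithm" side;
# run_forever never terminates, so under the definitions above it is a program only.

def one_cycle():
    """Produce one cycle of 30 R, 4 Y, 30 G and stop."""
    return ["R"] * 30 + ["Y"] * 4 + ["G"] * 30

def run_forever():
    """Repeat the cycle endlessly, like the stop light (iteration never ends)."""
    while True:
        for signal in one_cycle():
            yield signal

print(len(one_cycle()))          # 64 symbols, then it halts
# for s in run_forever(): ...    # would emit R/Y/G symbols without end
```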
Buffy Posted May 27, 2009

By your definition then, 90-odd percent of all programs in existence are not algorithms. That's fine, but riddle me this: what is the use of such an exclusive definition? In other words, "what problem are you trying to solve?"

An undefined problem has an infinite number of solutions, :phones:
Buffy
Qfwfq Posted May 27, 2009

Perhaps my advice has been carried all too far! The essential point was to exercise a smidgen of elasticity, without each one clinging to their own convention and maybe not even stating which it is. ;) So, it seems discussion can now proceed. :phones:

By your definition then, 90-odd percent of all programs in existence are not algorithms.

Here's another reason why rather few programs are algorithms (perhaps much fewer even than your complaint): we've been neglecting the further fact that an algorithm is understood as acting on an initial input; IOW it's all there, right from the start. Your program can use argv and argc and yet still be an algorithm. It can even be designed to read in from a file which must already be sitting there waiting. But make it interactive and it isn't really an algorithm! A sketch of the contrast is below.

Nitpicking: you'll say that one can write a program that reads stdin, but then run it with < infile.txt on the command line or put it in a pipeline!

Now I don't think we should be talking so much about coding issues (except illustratively) and the like, hence I also disagree about the language dependence of the number of steps being a problem; a "step" can even be a whole sub-algorithm! It remains that a program which is basically conceived to just sit there till the next user input, HTTP request or RFID event, whatever the heck, can scarcely be called an algorithm. It may however be executing algorithms for each of those, but call it a GUI, call it a service, call it a 'bot or call it a spade.

What's that ol' Wirth said? Algorithms + data structures = programs...
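Here is a minimal sketch of that contrast (Python assumed; the file name and functions are hypothetical): one procedure has its whole input on the command line when it starts, the other sits waiting on stdin indefinitely.

```python
# A minimal sketch of the contrast above (Python assumed; behaviour and names are illustrative).
# batch_sum has its whole input present when it starts, so it can be read as an algorithm;
# interactive_sum waits indefinitely on a user, so it is better called a service.

import sys

def batch_sum(argv):
    """All input is present at the start; the procedure terminates with an answer."""
    return sum(int(arg) for arg in argv)

def interactive_sum():
    """Sits waiting for the next line of input; only each individual update is algorithmic."""
    total = 0
    for line in sys.stdin:          # blocks until the user (or a pipe) supplies more data
        total += int(line)
        print(total)

if __name__ == "__main__":
    print(batch_sum(sys.argv[1:]))  # e.g. `python sums.py 1 2 3` prints 6
```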
stereologist Posted May 27, 2009

By your definition then, 90-odd percent of all programs in existence are not algorithms. That's fine, but riddle me this: what is the use of such an exclusive definition? In other words, "what problem are you trying to solve?"

At the end of the post before yours I stated: I mean, the main reason for these definitions is to be able to evaluate a computation. Is it correct? Is it as fast as it is possible to compute the value? Is there a way that uses less memory?

I gave three reasons for separating out algorithms from programs:
1. correctness
2. speed (a measure of time cost)
3. resource usage (memory, for example)

It has been shown that finite state transducers cannot compute everything that a Turing machine can. It has also been shown that a transducer based on a PDA with two stacks is equivalent to a Turing machine. The definitions restrict the possibilities, but create a series of analyzable mathematical machines.

Qfwfq, I don't think you are nitpicking, but rather pointing out how algorithms are used to construct large entities. I like the way you constructed real-life examples of how costs vary across steps, and that steps can be nontrivial.
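For readers unfamiliar with the weaker of the machines being compared, here is a minimal sketch of a finite-state transducer (Python assumed; the particular machine and its names are invented for illustration): a finite automaton that emits an output string on each move.

```python
# A minimal sketch of a finite-state transducer (Python assumed; the machine and its names
# are invented for illustration): a finite automaton that emits an output string on each move,
# as in the Aho/Ullman description quoted later in the thread.

def run_transducer(transitions, start, input_symbols):
    """transitions maps (state, symbol) -> (next_state, output_string)."""
    state, output = start, []
    for symbol in input_symbols:
        state, emitted = transitions[(state, symbol)]
        output.append(emitted)
    return "".join(output)

# Example machine: emit "1" whenever the current input bit equals the previous one, else "0".
same_as_last = {
    ("start", "0"): ("saw0", "0"), ("start", "1"): ("saw1", "0"),
    ("saw0", "0"): ("saw0", "1"),  ("saw0", "1"): ("saw1", "0"),
    ("saw1", "0"): ("saw0", "0"),  ("saw1", "1"): ("saw1", "1"),
}

print(run_transducer(same_as_last, "start", "110100"))  # -> "010001"
```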
maddog Posted May 27, 2009

A transducer is a simple device that has an input and converts it to a different form in the output. So a windmill is a transducer, converting the wind into the motion of a water pump. A better example is an electric motor converting electricity into motion.

I have an even better example: we as human beings, members of the animal kingdom, can be transducers, as we consume foodstuffs and produce excrement. It is just a different form.

In CS a transducer converts an input into a different form. The mathematical representation is a "recognizer which emits an output string during each move made. The finite transducer is obtained by taking a finite automaton and permitting the machine to emit a string of output symbols on each move." This quote is from p. 223 of Aho and Ullman.

You missed my point. It may be, imho, that in 200 years or so the use of the word transducer as you describe it will be found to be totally useless. In that vein, I will start now by NEVER using it in that way. Excuse me. I will find better words, like "parse", which I find works satisfactorily, as it has for the last 30 or so years. :doh:

The word transducer has a meaning which is not archaic. It is a term describing some of the mathematical machines that can be used to describe computations.

I never said archaic, unless you consider the '50s "archaic" -- why, that's the time period of the origin of computer languages like FORTRAN, et al. :D

maddog
maddog Posted May 27, 2009

Cool, but I'll still argue this is insufficient, since it still allows for my stop-light "algorithm" posted earlier:

It's indubitably a program based on endless extant implementations.
It produces "useful data" insofar as it prevents crashes between moving hunks of steel.
It stops occasionally when the power goes out, or the highway patrol is handling a special event, or it's taken out for service, but even then, the restriction is limited to "usually"...

So what are we gonna do about that "usually"? Is it a requirement or not as we define our "consensus terms?"

I bow to your greater wisdom. The Stop Light is an excellent example. Yet even the algorithm within (it does have one) is quite simple. (For each side -- proportioned so as not to have accidents:)
1. Go = Green.
2. Caution = Yellow.
3. Stop = Red.
4. Return to step 1.

The "rules" of when to go to the next state can be tuned and are "usually" a timing sequence.

I have opinions of my own -- strong opinions -- but I don't always agree with them, :doh:

I love that!!! :D

maddog
stereologist Posted May 27, 2009

You missed my point. It may be, imho, that in 200 years or so the use of the word transducer as you describe it will be found to be totally useless. In that vein, I will start now by NEVER using it in that way. Excuse me. I will find better words, like "parse", which I find works satisfactorily, as it has for the last 30 or so years.

Qfwfq, I suggest it is time to lock this thread. Apparently, it is pointless to try to introduce someone to topics they were not aware of. BTW, I have been doing this for 40 years. I've been aware of the proper meaning of parse for 40 years.
Qfwfq Posted May 27, 2009

Qfwfq, I don't think you are nitpicking, but rather pointing out how algorithms are used to construct large entities.

I addressed the hypothetical nitpicking myself. :doh:

In that vein, I will start now by NEVER using it in that way. Excuse me. I will find better words, like "parse", which I find works satisfactorily, as it has for the last 30 or so years.

Seems you've been misunderstanding: he isn't using it as a synonym for parse, but actually as a way of indicating something that does more than parsing (not just reporting OK or bad syntax), such as a compiler.
maddog Posted May 27, 2009

Seems you've been misunderstanding: he isn't using it as a synonym for parse, but actually as a way of indicating something that does more than parsing (not just reporting OK or bad syntax), such as a compiler.

I commend you, Qfwfq, for capturing that nuance. Yes, I am using it in the context of extraction of data from an input stream (be it keyboard, file, camera, sensor, etc.), and interpretation of the same.

Stereologist, I must admit I was not formally trained in Computer Science (though I did take it as a minor), so I may not have taken "all" the courses as required. My major was originally Astrophysics, transferring to Physics when I graduated (I did take all the required Astrophysics courses, though). I learned the bulk of my CS experience on the job in Aerospace/Defense.

40 years would put you at just before we landed on the moon; the HP-35 had not yet come out, and no actual evidence of the existence of black holes had yet been discovered. The language C did not even exist yet (except as research at AT&T Bell Labs). The GoF had not even published their book on Design Patterns yet. So in what context was "transducer"? I guess in AI; Snobol was available then, and I am not sure if Lisp was.

So please close this thread. We have already beaten this horse to death a few times already. :doh:

maddog
stereologist Posted May 27, 2009

The moon landing was 1969, 40 years ago. Fortran, BASIC, COBOL, RPG and assembler existed. The IBM 1130 was used to simulate the heating of the LEM. Colossus: The Forbin Project used IBM 620s. C was not around, but B and BCPL were. APL and BAL were around. The IBM 360 dates from 1964. I'll bet the early incarnations of Forth were available in 1969.

What passes as computer speak is nothing short of ebonics, or should I say, e-bonics.
Qfwfq Posted May 28, 2009

OK guys, let's try to get over it. I myself would have said that assemblers and compilers could be called translators (yup, just like from German to Russian), but if the term transducer has long been used, let's get on with the topic:

I recently learned that algorithm is a word being used without a definite definition, as seems to be the case with a lot of terms I have been analyzing. Anyway, I am wondering about this passage, that algorithms are seemingly exercised in fields beyond computer programming. Most algorithms are intended to be implemented as computer programs. However, algorithms are also implemented by other means, such as in a biological neural network (for example, the human brain implementing arithmetic or an insect looking for food), in an electrical circuit, or in a mechanical device. There is no reference; anybody able to direct me toward any?

May we then say:

- An algorithm is a procedure. One could say program, although it is not necessarily a computer program. It may be defined by constructs such as a flowchart, or in a language having constructs such as those described in the Böhm-Jacopini theorem. This makes it a deterministic procedure, even though it might incorporate making a random choice at some steps (this doesn't make the procedure itself randomly defined).
- An algorithm has the purpose of finding (one of the) solutions of a given problem, when executed for a given case. If it outputs a solution, it must be certain that this is a valid one. If necessary, the problem must specify a range for quantities in order to make it feasible to reach solutions.
- An algorithm can be performed on an input: a set of data which amounts to specifying a case or instance of the problem.
- It must also be designed to terminate within finite "time", if necessary outputting that no valid solution has been reached.

The term sometimes gets used in ways that slacken the last two points.
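As a hedged illustration of the four points (Python assumed; the problem and names are chosen only for this example), here is a sketch of a procedure that acts on an input instance, verifies any answer it returns, and always terminates, reporting when no valid solution exists within the stated range.

```python
# A minimal sketch of a procedure meeting the four points above (Python assumed; the problem
# and names are chosen only for illustration): it acts on an input instance, checks that any
# answer it returns is valid, and always terminates, reporting "no solution" when needed.

def find_divisor(n, low, high):
    """Return a divisor of n in [low, high], or None if no such divisor exists."""
    for candidate in range(low, high + 1):         # the problem specifies a finite range,
        if candidate != 0 and n % candidate == 0:  # so termination is guaranteed
            return candidate                       # the output, if any, is a verified solution
    return None                                    # terminates with "no valid solution reached"

print(find_divisor(91, 2, 12))   # -> 7
print(find_divisor(97, 2, 12))   # -> None (97 is prime, so no divisor in that range)
```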
maddog Posted May 28, 2009

Qfwfq,

Your four bullets, I feel, address a definition of algorithm well enough, with the following proviso: that said algorithm is already considered a valid one. Restricting an algorithm to a finite number of steps is fine enough. Consider the calculation of primes. The algorithm only calculates one prime at a time. Doing so for each one is definitely a finite set of steps, no matter how big the prime is.

maddog
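A minimal sketch of that point (Python assumed; names are illustrative): finding the next prime after a given n takes a finite number of steps for each call, even though repeating the call forever would enumerate primes without end.

```python
# A minimal sketch (Python assumed; names are illustrative): each call computes one prime
# in finitely many steps, however large the numbers involved get.

def is_prime(m):
    """Trial division: finitely many checks, each bounded by sqrt(m)."""
    if m < 2:
        return False
    d = 2
    while d * d <= m:
        if m % d == 0:
            return False
        d += 1
    return True

def next_prime(n):
    """Return the smallest prime greater than n; termination is guaranteed (Bertrand's postulate)."""
    candidate = n + 1
    while not is_prime(candidate):
        candidate += 1
    return candidate

print(next_prime(100))   # -> 101
```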