An Algorithmic and Social Introduction to Computer Science (CSC-105 2000S)

Computer programmers often emphasize efficiency as the primary criterion for evaluating algorithms. What effects do you think this emphasis has?


When computer programmers focus on the efficiency of an algorithm, they take into consideration such factors as memory and the time it takes to execute the program. In focusing on efficiency, computer programmers may neglect other factors in choosing an algorithm, such as simplicity and clarity. One factor that is not always considered is the next programmer. There is an assumption that all programmers will understand the previous programmer's work.
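As a hypothetical illustration of that trade-off (my own sketch, not part of the original response): the two Python functions below compute the same thing, but the bit-trick version is the "efficient" one, while the loop version is the one the next programmer can actually read.

```python
def is_power_of_two_clever(n):
    # Efficient bit trick: a positive power of two has exactly one
    # bit set, so n & (n - 1) clears that bit and leaves zero.
    return n > 0 and n & (n - 1) == 0

def is_power_of_two_clear(n):
    # Slower but self-explanatory: divide by two until we can't.
    if n <= 0:
        return False
    while n % 2 == 0:
        n //= 2
    return n == 1
```

Both give the same answers; the question is which one a maintainer will understand six months later.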


I think that the emphasis on efficiency can have the effect of glossing over problems and limiting applications. As far as problems go, I think that the whole "Y2K" thing is a good example: using two digits for a year might have made programs more efficient, but it ended up causing problems too (and giving computer programmers jobs to fix them, which maybe was the point to begin with). When humans strive to do things faster and faster, errors inevitably result, and I can't help but think that it is this way with the things we create as well.
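To make the Y2K point concrete, here is a hypothetical sketch (the pivot value of 30 is my own assumption, not anything from the discussion) of the "windowing" patch many programs used: a two-digit year is ambiguous, so the code has to guess the century.

```python
def expand_two_digit_year(yy):
    # Storing two digits saved space, but "00" could mean 1900 or 2000.
    # A common patch was a pivot window: values below the pivot are
    # read as 20xx, the rest as 19xx. The pivot (30 here) is a guess.
    pivot = 30
    return 2000 + yy if yy < pivot else 1900 + yy
```

So `expand_two_digit_year(99)` gives 1999 and `expand_two_digit_year(5)` gives 2005 — but a birth year of 1925 stored as 25 comes back as 2025, which is exactly the kind of problem the efficiency saving created.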

I also think that the focus on efficiency limits the possibilities and creativity in creating algorithms and programs. If you want to do something fast you have to do it very simply and on a very small scale. I'm thinking about artificial intelligence, or computers that learn, that kind of thing...perhaps they are not extremely efficient, but maybe tolerating a little inefficiency now would create more efficiency later.


If efficiency, or running time, is the prime consideration when evaluating an algorithm, I'm assuming that other, arguably equal or more pressing, concerns are being overlooked, such as accuracy or potential bugs in the program. You always say that one should account for the worst-case scenario in programming, but that advice does not sit comfortably with the motive of finding the easiest way out. So my instinct is to apply general knowledge to this question and say that the fastest or easiest way to do something is often not the best way to do it, and programming might benefit from evaluating algorithms from a broader and more holistic perspective.
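One concrete way to see the worst-case point (an illustrative sketch of my own, not from the original response): a naive quicksort that always picks the first element as its pivot is fast on average, but on already-sorted input its comparison count blows up from roughly n·log n to n·(n−1)/2.

```python
def quicksort_comparison_count(items):
    # Naive quicksort, first element as pivot. Returns how many
    # element-vs-pivot comparisons the sort performs.
    if len(items) <= 1:
        return 0
    pivot = items[0]
    less, more = [], []
    for x in items[1:]:  # one comparison per remaining element
        (less if x < pivot else more).append(x)
    return (len(items) - 1
            + quicksort_comparison_count(less)
            + quicksort_comparison_count(more))
```

On a shuffled list of 100 items this does a few hundred comparisons; on the already-sorted `list(range(100))` it does 4950 = 100·99/2 — the worst case that an average-case efficiency claim quietly ignores.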


Computer users would hope that this emphasis would ensure quick programs which accomplish tasks with few errors. I think, to a large extent, these effects are real. Efficiency demands that we take the fewest number of steps to accomplish a task accurately, so that the computer can follow the algorithm easily and so that there is no question about the process. However, making algorithms operate faster often requires more abstract, complex thinking, and so it may be more difficult to translate human thought into simple instructions that work in all cases.

The quest for efficiency also seems to manifest itself in some dearly stupid ways, such as a computer's overactive grammar autocorrection: the computer can be programmed to operate efficiently, but so efficiently that it goes too far. But this is not related directly to the question at hand.

Computers can only process numbers quickly, it seems, so the quest for efficiency might entail a large amount of programming work and code behind successful programs (to produce quick results via codified data broken into elements, arrays, etc.), which makes the need for computer memory more critical. We also know that there seems to be a mathematical limit to the achievement of efficiency on a computer because of its limited options in abstract design (just 1's and 0's, after all), yet efficiency will always be the objective by which we evaluate fruition in computing. Moreover, because the quest for efficiency is highly mathematical and based on the need to consume less time per operation, the applications discovered in this framework might be limited by focusing on speed and number of steps instead of the future applications that could stem from broader considerations at the outset of programming a computer with an algorithm.

Hope all that makes sense, but those are my thoughts about this question for now.


In emphasizing efficiency, I'm thinking that the actual quality of the execution is one thing that may be sacrificed. An algorithm may be very fast, but if it doesn't do as good a job at, let's say, sorting information or lists, as a slower one, then efficiency doesn't do you much good. It seems like the efficiency of an algorithm needs to be task-specific, or looked at in terms of how successful it is at whatever task actually needs to be done.

As far as readability goes, I don't know if computers are able to read and execute certain algorithms better than others or not (better in terms of possible glitches that could occur when doing certain functions). But if this is the case, then algorithms that might be faster but are also more likely than slower ones to create some sort of problem for the user probably wouldn't be as favorable as one might think.


1. An efficient programming language reduces the time it takes for computers to execute commands.

2. An efficient programming language enables very large algorithms to be computed. If the language were not efficient, it might take too long for any computer to carry out the algorithm.

3. An efficient programming language can be more complicated to create, change, or understand.

4. Some programming languages are great innovations in computer science and should not be dismissed if they are not efficient. Efficiency should be considered a tie-breaker for the utility of an algorithm, not an all-determining factor.

5. Too much emphasis on efficiency might overlook less efficient solutions to currently unsolved issues.


Well, first of all, by efficiency I'm assuming we mean the time and memory required to run the program. Maximizing efficiency would cause programmers to be cautious and aware of the kind of language they use to express their algorithms, hopefully knowing the way that would take the least amount of time and memory. If it were difficult to know or estimate this, many experimental trials would have to be run to test the efficiency of different programs. So I think a lot of it would be trial and error until they established a basic set of rules to follow in order to maximize efficiency. I think this is the way to go, though; efficiency should be the number one criterion for evaluating algorithms.
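The trial-and-error measurement described above can be done directly in code. This is a minimal sketch of my own (in Python, with names I made up): time the same membership lookup done two ways and compare the results.

```python
import time
from bisect import bisect_left

def best_time(fn, repeats=5):
    # Crude benchmark: run fn several times, keep the fastest run.
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - start)
    return best

data = list(range(100_000))

def linear_lookup(x):
    return x in data                 # scans up to the whole list

def binary_lookup(x):
    i = bisect_left(data, x)         # halves the search range each step
    return i < len(data) and data[i] == x
```

Running `best_time(lambda: linear_lookup(99_999))` against `best_time(lambda: binary_lookup(99_999))` shows the linear scan is much slower on a worst-case lookup — the kind of experimental evidence that lets programmers build up those rules of thumb.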


Efficiency is great, and can often be very useful for choosing among algorithms that solve the same problem. However, efficiency can also mean that a programmer has spent less time on contingency engineering of the algorithm and more on efficiency, so if there are exceptions or different cases which may affect the algorithm's performance, they can be overlooked. Emphasizing efficiency can also lead to the algorithm's not being flexible enough to deal with other problems that a less efficient algorithm can solve easily. And, in the case of Microsoft products, the algorithm may sometimes be faulty and fail exactly when it is needed. (mandatory Microsoft attack of the day)


I think whenever you are designing anything, you have to realize that there are many aspects of the design that are equally important. If one gets emphasized over another, it can very well lead to a decline in quality of the other elements.

So, while it is good to have efficient algorithms, speed is not the only thing they have to accomplish. They also have to be functional in all situations, able to deal with complications as they arise. They should be able to process difficult or unusual data and/or results. The user should be able to understand what is being done. Results should also be presented in a clear manner.


If the constant search for the fastest programming and problem solving is the priority, it seems like there has to be some kind of trade-off. I would suppose that this lies in the accuracy with which these programs work, but computer technologies seem to navigate this issue. If there are ways to consistently improve the speed at which computers perform, there doesn't seem to be a huge problem. But what happens when the maximum output is achieved? The competition to make a faster computer is fierce now, and is sure to keep advancing the possibilities. I guess I'm left with a question, wondering what the trade-offs are for high-speed computing?



Source text last modified Sun Feb 13 09:55:46 2000.

This page generated on Wed Feb 16 08:36:14 2000 by Siteweaver.
