Saturday, January 15, 2011
Algorithmic Efficiency -- Beating a Dead Horse Faster
In computer science, often the question is not how to solve a problem, but how to solve a problem well. For instance, take the problem of sorting. Many sorting algorithms are well-known; the problem is not to find a way to sort words, but to find a way to efficiently sort words. This article is about understanding how to compare the relative efficiency of algorithms and why it's important to do so.
If it's possible to solve a problem by using a brute force technique, such as trying out all possible combinations of solutions (for instance, sorting a group of words by trying all possible orderings until you find one that is in order), then why is it necessary to find a better approach? The simplest answer is that, if you had a fast enough computer, maybe it wouldn't be. But as it stands, we do not have access to computers fast enough. For instance, if you were to try out all possible orderings of 100 words, that would require checking up to 100! (100 factorial) orderings. That's a number with 158 digits; were you to check 1,000,000,000 possibilities per second, you would still need roughly 10^149 seconds, which is far longer than the expected life of the universe. Clearly, having a more efficient algorithm to sort words would be handy; and, of course, there are many that can sort 100 words well within the life of the universe.
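The arithmetic above is easy to check for yourself. A quick sketch in Python (not part of the original article) that computes 100! and the time a billion-checks-per-second machine would need:

```python
import math

# Number of possible orderings of 100 words
orderings = math.factorial(100)
print(len(str(orderings)))  # 100! has 158 digits

# At one billion orderings checked per second:
seconds = orderings // 10**9
print(seconds > 10**148)  # still an astronomically large number of seconds
```

For comparison, the age of the universe is on the order of 10^17 seconds, so brute force is hopeless here.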
Before going further, it's important to understand some of the terminology used for measuring algorithmic efficiency. Usually, the efficiency of an algorithm is expressed as how long it runs in relation to its input. For instance, in the above example, we showed how long it would take our naive sorting algorithm to sort a certain number of words. Usually we refer to the length of the input as n; so, for the above example, the efficiency is roughly n!. You might have noticed that it's possible to hit the correct ordering early in the attempt -- for instance, if the words are already partially ordered, it's unlikely that the algorithm would have to try all n! combinations. Often we refer to the average efficiency, which in this case would be n!/2. But because the division by two becomes insignificant as n grows larger (half of 2 billion is, for instance, still a very large number), we usually ignore such constant factors.
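To make the naive algorithm concrete, here is a minimal sketch (in Python; the function name and word list are illustrative, not from the article) that tries permutations until it finds the sorted one and reports how many attempts it took -- on average, about half of all n! orderings:

```python
import itertools
import random

def brute_force_sort(words):
    """Try permutations in turn until one is in order; return it and the attempt count."""
    tries = 0
    for perm in itertools.permutations(words):
        tries += 1
        if list(perm) == sorted(words):
            return list(perm), tries

random.seed(1)
words = random.sample(["ant", "bee", "cat", "dog", "elk", "fox"], 6)
result, tries = brute_force_sort(words)
print(result, tries)  # tries falls somewhere between 1 and 6! = 720
```

Even for six words there are already 720 orderings to search; each additional word multiplies that count.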
Now that we can describe any algorithm's efficiency in terms of its input length (assuming we know how to compute the efficiency), we can compare algorithms based on their "order". Here, "order" refers to the mathematical function describing how the running time grows with the input -- for instance, n^2 is "order of n squared," and n! is "order of n factorial." It should be obvious that an algorithm of order n^2 is much less efficient than an algorithm of order n. But not all orders are polynomial -- we've already seen n!, and some algorithms are order log n, or order 2^n.
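The difference between these orders is easiest to see side by side. A small table (generated in Python; the particular values of n are chosen just for illustration) comparing log n, n^2, 2^n, and n!:

```python
import math

# Compare how common orders grow as n increases
rows = []
for n in (1, 5, 10, 20):
    rows.append((n, math.log2(n), n**2, 2**n, math.factorial(n)))
    print("n=%2d  log2(n)=%5.2f  n^2=%4d  2^n=%8d  n!=%d" % rows[-1])
```

By n = 20, n^2 is only 400, while 2^n has passed a million and n! is already a 19-digit number -- which is why exponential and factorial algorithms become unusable so quickly.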
Often, order is abbreviated with a capital O: for instance, O(n^2). This notation, known as big-O notation, is the typical way of describing algorithmic efficiency; note that big-O notation typically does not include constant factors. Also, if you determine the order of an algorithm and it turns out to be the sum of several terms, you typically express the efficiency as only the term with the highest order. For instance, if you have an algorithm with efficiency n^2 + n, then it is an algorithm of order O(n^2).
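Why the lower-order term can be dropped is worth seeing numerically. A quick check (in Python, for illustration) of how n^2 + n compares to n^2 alone as n grows:

```python
# The lower-order term n becomes negligible next to n^2,
# which is why n^2 + n is written as just O(n^2).
ratios = [(n, (n**2 + n) / n**2) for n in (10, 1000, 100000)]
for n, r in ratios:
    print(n, r)
```

The ratio approaches 1 as n grows, so for large inputs the n term contributes essentially nothing to the running time.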
Makar Sankranti
Makar Sankranti, the harvest festival of India, is celebrated from 14 January 2011 to 16/17 January. Hindupad wishes you all a Happy Makar Sankranti. Today (14 January 2011) is Bhogi, the day of the bonfire (bhogi mantalu). In Andhra Pradesh, Karnataka, and Maharashtra, Bhogi is celebrated with utmost pomp. It is said that on this day Lord Sri Ranganatha married Goddess Goda Devi. (Note: on 15 January 2011 (Sankranti), wearing red sarees or dresses is auspicious, and the special food item (naivedyam) to offer to the Sun God is Kheer (Payasam).)
Apart from Bhogi mantalu, Bhogi Pallu is another famous ritual on Bhogi (in some places, it is also observed on Sankranthi). 'Bhogi pallu' refers to gooseberry fruits which, along with some food items, rice, other grains, and coins, are placed on the heads of children and offered to their maids as Drishti perantam. It is believed that doing so makes the children healthier, happier, and longer-lived. The Makarajyothi festival is held during Makara Sankramana at the Sabarimala Ayyappa temple (14 January 2011 at 6.30 pm).
On Sankranti day (15 January 2011), the Sun enters Makara Rashi. This marks the beginning of Uttarayana punyakalam. Performing punya snana (a holy dip) in holy rivers such as the Ganga, Yamuna, Godavari, and Cauvery is highly auspicious. Lord Surya is offered special puja and special naivedyam on this day. The Gangasagar Mela is held during Sankranthi.
Kanuma (16 January 2011) is the third and concluding day of the Makar Sankranti festival (in fact, a fourth day is celebrated as Mukkanuma, but it is not widely observed). On Kanuma day, cattle (oxen, cows, buffaloes, etc.) are decorated and worshipped. In some places, Lord Shiva and Goddess Parvati, and Lord Narayana and Goddess Lakshmi, are worshipped on Kanuma.