An In-Depth Comparison of Sorting Algorithms: Computer Science Essay

Published: November 9, 2015

Sorting refers to the operation of rearranging the elements of a list so that they are in increasing order. For example, suppose A is a list of n numbers in the unsorted form 8, 4, 19, 2, 7; after sorting, the list is 2, 4, 7, 8, 19. Sorting does not always mean increasing order; it may also mean arranging elements in decreasing order. Either way, it is a way of organising data for efficient search and retrieval. The underlying data structure here is the one-dimensional (linear) array, in which the stored elements are numbered with consecutive integers and accessed by those numbers. Many algorithms have been developed for sorting data efficiently; the ones discussed here apply to structures residing in main memory. First, what is a sorting algorithm? It is an algorithm that puts the elements of a list in a certain order. Sorting algorithms can be classified by:

Computational complexity: the worst-case, average-case, and best-case behaviour, measured in element comparisons as a function of the size of the list.

Recursion: some algorithms are recursive, some are non-recursive, and others may be implemented either way.

Stability: stable sorting algorithms maintain the relative order of records with equal keys. If a sorting method is stable, it is also possible to sort on multiple keys by sorting several times, once per key; records that compare equal on a later key keep the order produced by the earlier keys.

Types of sorting

Now, here are some of the different sorting techniques used in data structures:

Bubble Sort.

Insertion Sort.

Quick Sort.

Selection Sort.

Heap Sort.

Merge Sort.

Other sorting techniques exist, but the ones above are the techniques discussed here, in terms of:

How they work, through their algorithms.

Their uses, and the comparisons between them.

Descriptions of the sorting techniques:

Bubble Sort:

Bubble sort is a straightforward and simplistic method of sorting data. The algorithm starts at the beginning of the data set: it compares the first two elements and, if the first is greater than the second, swaps them. It continues doing this for each pair of adjacent elements up to the end of the data set, then starts again with the first two elements, repeating until no swaps have occurred on the last pass. This algorithm is highly inefficient and is rarely used on large data sets.

Insertion Sort:

Insertion sort is a simple sorting algorithm that is relatively efficient for small lists and mostly sorted lists. It works by taking elements from the list one by one and inserting each into its correct position within the sorted portion built up so far.

Quick Sort:

Quicksort is a divide-and-conquer algorithm that relies on a partition operation: it moves all smaller elements before a chosen element (the pivot) and all greater elements after it. The problem of sorting a set is thereby reduced to the problem of sorting two smaller sets, and the same reduction step is applied to each of those. For example, with the list 7, 2, 9, 4, 5 and pivot 7, partitioning yields 5, 2, 4 before the pivot and 9 after it; sorting those two smaller lists and concatenating gives 2, 4, 5, 7, 9.

Selection Sort:

Selection sort is a sorting algorithm, specifically an in-place comparison sort. It has O(n²) complexity, making it inefficient on large lists. It finds the smallest element in the list and puts it in the first position, then finds the second smallest element and puts it in the second position, and so on.

Heap Sort:

Heapsort is a much more efficient version of selection sort. It also works by determining the largest (or smallest) element of the list, placing that at the end (or beginning) of the list, and then continuing with the rest of the list, but it accomplishes this task efficiently by using a data structure called a heap, a special type of binary tree. Once the data list has been made into a heap, the root node is guaranteed to be the largest (or smallest) element. When it is removed and placed at the end of the list, the heap is rearranged so that the largest remaining element moves to the root.

Merge Sort:

Merge sort takes advantage of the ease of merging already sorted lists into a new sorted list. It starts by comparing every two elements (i.e., 1 with 2, then 3 with 4, ...) and swapping them if the first should come after the second. It then merges each of the resulting lists of two into lists of four, then merges those lists of four, and so on, until at last two lists are merged into the final sorted list.

Comparisons of Sorting Techniques and Their Uses:

The comparison between these sorting techniques can be made through their algorithms and their complexity.

Algorithm of Bubble Sort:

(Bubble Sort) BUBBLE(DATA, N)
1. Repeat Steps 2 and 3 for K = 1, 2, ..., N-1.
2. Set PTR := 1. [Initializes pass pointer PTR.]
3. Repeat while PTR <= N-K: [Executes pass.]
   (a) If DATA[PTR] > DATA[PTR+1], then: Interchange DATA[PTR] and DATA[PTR+1]. [End of If structure.]
   (b) Set PTR := PTR+1.
   [End of inner loop.]
[End of Step 1 outer loop.]
4. Exit.
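As an illustration only, here is one possible Python rendering of this pseudocode (the function name bubble_sort and the 0-based indexing are ours; the swapped flag implements the early exit described earlier, which the numbered steps above omit):

def bubble_sort(data):
    """In-place bubble sort in ascending order; returns the sorted list."""
    n = len(data)
    for k in range(n - 1):                # passes K = 1, ..., N-1
        swapped = False
        for ptr in range(n - 1 - k):      # PTR runs over the unsorted prefix
            if data[ptr] > data[ptr + 1]:
                # interchange DATA[PTR] and DATA[PTR+1]
                data[ptr], data[ptr + 1] = data[ptr + 1], data[ptr]
                swapped = True
        if not swapped:                   # no swaps on this pass: already sorted
            break
    return data

For example, bubble_sort([8, 4, 19, 2, 7]) returns [2, 4, 7, 8, 19].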

Complexity:

The time for a sorting algorithm is measured in terms of the number of comparisons; this number f(n) is easily computed. There are n-1 comparisons during the first pass, which places the largest element in the last position; n-2 comparisons in the second pass, which places the second largest element in the next-to-last position; and so on. So f(n) = (n-1) + (n-2) + ... + 2 + 1 = n(n-1)/2 = n²/2 + O(n) = O(n²). The time required to execute the bubble sort algorithm is therefore proportional to n², which means that the average case and the worst case are both O(n²). The best case of bubble sort (with the early-exit check on swaps) is O(n). It requires only O(1) extra memory, and it is stable.

Its running time is quadratic, of the same order as the schoolbook method of multiplying two n-digit numbers. Compared with other algorithms its main merit is that it is very easy to understand, and it may also be useful for small data sets.

Algorithm of Insertion Sort:

(Insertion Sort) INSERTION(A, N)
1. Set A[0] := -∞. [Initializes sentinel element.]
2. Repeat Steps 3 to 5 for K = 2, 3, ..., N:
3. Set TEMP := A[K] and PTR := K-1.
4. Repeat while TEMP < A[PTR]:
   (a) Set A[PTR+1] := A[PTR]. [Moves element forward.]
   (b) Set PTR := PTR-1.
   [End of loop.]
5. Set A[PTR+1] := TEMP. [Inserts element in proper place.]
[End of Step 2 loop.]
6. Return.
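A rough Python sketch of the same procedure (names ours): since Python lists have no spare A[0] slot, the -∞ sentinel is replaced by an explicit bounds check on the pointer:

def insertion_sort(a):
    """In-place insertion sort in ascending order."""
    for k in range(1, len(a)):            # insert a[k] into the sorted a[0:k]
        temp = a[k]
        ptr = k - 1
        while ptr >= 0 and temp < a[ptr]: # bounds check replaces the sentinel
            a[ptr + 1] = a[ptr]           # move the larger element forward
            ptr -= 1
        a[ptr + 1] = temp                 # insert temp in its proper place
    return a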

Complexity: The number f(n) of comparisons in the insertion sort is easily computed. The worst case occurs when the array A is in reverse order and the inner loop must use the maximum number K-1 of comparisons:

f(n) = 1 + 2 + ... + (n-1) = n(n-1)/2 = O(n²).

For the average case there will be approximately (K-1)/2 comparisons in the inner loop:

f(n) = 1/2 + 2/2 + ... + (n-1)/2 = n(n-1)/4 = O(n²).

This is a very slow algorithm when n is very large, but it is the algorithm of choice when the data is nearly sorted or when the problem size is small. It is also stable, which is one reason insertion sort is often used as the base case of recursive sorts once the subproblems become small.

Algorithm of Quick Sort:

(Quicksort) This algorithm sorts an array A with N elements.

QUICK(A, N, BEG, END, LOC)
1. [Initialize.] Set LEFT := BEG, RIGHT := END and LOC := BEG.
2. [Scan from right to left.]
   (a) Repeat while A[LOC] <= A[RIGHT] and LOC != RIGHT: RIGHT := RIGHT-1. [End of loop.]
   (b) If LOC = RIGHT, then: Return.
   (c) If A[LOC] > A[RIGHT], then:
       (i) [Interchange A[LOC] and A[RIGHT].] TEMP := A[LOC], A[LOC] := A[RIGHT], A[RIGHT] := TEMP.
       (ii) Set LOC := RIGHT.
       (iii) Go to Step 3.
       [End of If structure.]
3. [Scan from left to right.]
   (a) Repeat while A[LEFT] <= A[LOC] and LEFT != LOC: LEFT := LEFT+1. [End of loop.]
   (b) If LOC = LEFT, then: Return.
   (c) If A[LEFT] > A[LOC], then:
       (i) [Interchange A[LEFT] and A[LOC].] TEMP := A[LOC], A[LOC] := A[LEFT], A[LEFT] := TEMP.
       (ii) Set LOC := LEFT.
       (iii) Go to Step 2.
       [End of If structure.]

QUICKSORT:
1. [Initialize.] Set TOP := NULL.
2. [Push boundary values of A onto the stacks when A has 2 or more elements.] If N > 1, then: TOP := TOP+1, LOWER[1] := 1, UPPER[1] := N.
3. Repeat Steps 4 to 7 while TOP != NULL.
4. [Pop sublist from stacks.] Set BEG := LOWER[TOP], END := UPPER[TOP], TOP := TOP-1.
5. Call QUICK(A, N, BEG, END, LOC).
6. [Push left sublist onto stacks when it has 2 or more elements.] If BEG < LOC-1, then: TOP := TOP+1, LOWER[TOP] := BEG, UPPER[TOP] := LOC-1. [End of If structure.]
7. [Push right sublist onto stacks when it has 2 or more elements.] If LOC+1 < END, then: TOP := TOP+1, LOWER[TOP] := LOC+1, UPPER[TOP] := END. [End of If structure.]
[End of Step 3 loop.]
8. Exit.
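For illustration, a Python sketch of the same scheme (0-based indices, names ours): quick mirrors the QUICK partition procedure, and a Python list of (beg, end) pairs plays the role of the LOWER and UPPER stacks:

def quick(a, beg, end):
    """Place the first element of a[beg:end+1] in its final position and
    return that position (the partition step)."""
    left, right, loc = beg, end, beg
    while True:
        # scan from right to left
        while a[loc] <= a[right] and loc != right:
            right -= 1
        if loc == right:
            return loc
        a[loc], a[right] = a[right], a[loc]
        loc = right
        # scan from left to right
        while a[left] <= a[loc] and left != loc:
            left += 1
        if loc == left:
            return loc
        a[loc], a[left] = a[left], a[loc]
        loc = left

def quicksort(a):
    """Iterative quicksort driven by an explicit stack of sublist bounds."""
    if len(a) > 1:
        stack = [(0, len(a) - 1)]         # LOWER/UPPER stacks combined
        while stack:
            beg, end = stack.pop()
            loc = quick(a, beg, end)
            if beg < loc - 1:             # left sublist has 2+ elements
                stack.append((beg, loc - 1))
            if loc + 1 < end:             # right sublist has 2+ elements
                stack.append((loc + 1, end))
    return a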

Complexity: The time complexity of this sorting algorithm is measured by the number of comparisons needed to sort n elements. The worst case occurs when the list is already sorted: then f(n) = n + (n-1) + ... + 2 + 1 = n(n+1)/2 = n²/2 + O(n) = O(n²). In the average case, f(n) = O(n log n).

Quicksort turns out to be the fastest of these sorting algorithms in practice. It has a time complexity of O(n log n) on average. In its worst case, O(n²), quicksort is as slow as bubble sort. There are sorting algorithms with a complexity of O(n log n) even in the worst case, e.g. heapsort and merge sort, but on average these algorithms are slower than quicksort. Quicksort is more popular because it sorts in place with a small hidden constant, whereas merge sort requires a large amount of extra memory.

Algorithm of Selection Sort:

This algorithm sorts the array A with N elements. [MIN(A, K, N, LOC) is a procedure that finds the location LOC of the smallest element among A[K], A[K+1], ..., A[N].]

SELECTION(A, N)
1. Repeat Steps 2 and 3 for K = 1, 2, ..., N-1.
2. Call MIN(A, K, N, LOC).
3. [Interchange A[K] and A[LOC].] Set TEMP := A[K], A[K] := A[LOC] and A[LOC] := TEMP.
[End of Step 1 loop.]
4. Exit.
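A small Python version for illustration (names ours), with find_min standing in for the MIN procedure:

def find_min(a, k):
    """Return the index of the smallest element of a[k:] (the MIN procedure)."""
    loc = k
    for j in range(k + 1, len(a)):
        if a[j] < a[loc]:
            loc = j
    return loc

def selection_sort(a):
    """In-place selection sort in ascending order."""
    for k in range(len(a) - 1):
        loc = find_min(a, k)
        a[k], a[loc] = a[loc], a[k]       # one swap per pass: Θ(n) swaps total
    return a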

Complexity: It has the same complexity as bubble sort in the worst case and the average case, i.e. O(n²). Among simple average-case Θ(n²) algorithms, selection sort almost always outperforms bubble sort, but it is in turn usually beaten by insertion sort, which on average needs about half as many comparisons as selection sort. However, selection sort always performs Θ(n) swaps, while insertion sort performs Θ(n²) swaps in the average and worst cases.

Algorithm of Heap Sort:

Given an array A with N elements, this algorithm sorts the elements of A using the heap procedures INSHEAP (insert into a heap H) and DELHEAP (delete the root of H); note that both procedures update N in place.

INSHEAP(TREE, N, ITEM)
1. [Add new node to H and initialize PTR.] Set N := N+1 and PTR := N.
2. [Find location to insert ITEM.] Repeat Steps 3 to 6 while PTR > 1.
3. Set PAR := ⌊PTR/2⌋. [Location of parent node.]
4. If ITEM <= TREE[PAR], then: Set TREE[PTR] := ITEM, and Return. [End of If structure.]
5. Set TREE[PTR] := TREE[PAR]. [Moves node down.]
6. Set PTR := PAR. [Updates PTR.]
[End of Step 2 loop.]
7. [Assign ITEM as the root of H.] Set TREE[1] := ITEM.
8. Return.

DELHEAP(TREE, N, ITEM)
1. Set ITEM := TREE[1]. [Removes root of H.]
2. Set LAST := TREE[N] and N := N-1. [Removes last node of H.]
3. Set PTR := 1, LEFT := 2 and RIGHT := 3. [Initializes pointers.]
4. Repeat Steps 5 to 7 while RIGHT <= N:
5. If LAST >= TREE[LEFT] and LAST >= TREE[RIGHT], then: Set TREE[PTR] := LAST and Return. [End of If structure.]
6. If TREE[RIGHT] <= TREE[LEFT], then: Set TREE[PTR] := TREE[LEFT] and PTR := LEFT.
   Else: Set TREE[PTR] := TREE[RIGHT] and PTR := RIGHT.
   [End of If structure.]
7. Set LEFT := 2*PTR and RIGHT := LEFT+1.
[End of Step 4 loop.]
8. If LEFT = N and LAST < TREE[LEFT], then: Set PTR := LEFT.
9. Set TREE[PTR] := LAST.
10. Return.

HEAPSORT(A, N)
1. [Build a heap A.] Repeat for J = 1, 2, ..., N-1:
   Call INSHEAP(A, J, A[J+1]).
   [End of loop.]
2. [Sort A by repeatedly deleting the root of H.] Repeat while N > 1:
   (a) Call DELHEAP(A, N, ITEM).
   (b) Set A[N+1] := ITEM.
   [End of loop.]
3. Exit.
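As a compact illustration, the Python sketch below uses the common sift-down formulation instead of a literal transcription of INSHEAP and DELHEAP, but the two phases are the same: build a max-heap, then repeatedly remove the root and re-heap (all names ours):

def sift_down(a, root, end):
    """Sink the element at index root until the heap property holds in a[:end].
    In this 0-based layout, the children of index i are 2*i+1 and 2*i+2."""
    while 2 * root + 1 < end:
        child = 2 * root + 1
        if child + 1 < end and a[child] < a[child + 1]:
            child += 1                    # choose the larger child
        if a[root] >= a[child]:
            return                        # heap property already holds
        a[root], a[child] = a[child], a[root]
        root = child

def heapsort(a):
    """In-place heapsort in ascending order."""
    n = len(a)
    for start in range(n // 2 - 1, -1, -1):   # Phase A: build the heap
        sift_down(a, start, n)
    for end in range(n - 1, 0, -1):           # Phase B: delete roots
        a[0], a[end] = a[end], a[0]           # move the largest to the end
        sift_down(a, 0, end)                  # re-heap the remainder
    return a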

Complexity: This algorithm has two phases, and there is a complexity bound for each phase when applied to an array A with n elements:

Phase A: H is a heap. Observe that the number of comparisons to find the place of a new ITEM in H cannot exceed the depth of H. Since H is a complete tree, its depth is bounded by log₂ m, where m is the number of elements currently in H. So the total number g(n) of comparisons to insert the n elements of A into H is bounded as

g(n) <= n log₂ n.

Phase B: Deleting an element requires reheaping, which costs a bounded number of comparisons per level of H, so the total number h(n) of comparisons to delete the n elements of A from H is bounded as

h(n) <= 4n log₂ n.

So the running time to sort the n-element array A using heapsort is proportional to n log₂ n, i.e. f(n) = O(n log₂ n). Hence the average case and the worst case of the algorithm are both O(n log n).

Algorithms of Merge Sort:

These are the algorithms that sort the N-element array A using an auxiliary array B:

MERGE(A, R, LBA, B, S, LBB, C, LBC)
[Merges the R elements of A starting at LBA and the S elements of B starting at LBB into C starting at LBC.]
1. [Initialize.] Set NA := LBA, NB := LBB, PTR := LBC, UBA := LBA+R-1, UBB := LBB+S-1.
2. [Compare.] Repeat while NA <= UBA and NB <= UBB:
   If A[NA] < B[NB], then: Set C[PTR] := A[NA] and NA := NA+1.
   Else: Set C[PTR] := B[NB] and NB := NB+1.
   [End of If structure.]
   Set PTR := PTR+1.
   [End of loop.]
3. [Assign remaining elements to C.]
   If NA > UBA, then: Repeat for K = 0, 1, ..., UBB-NB: Set C[PTR+K] := B[NB+K]. [End of loop.]
   Else: Repeat for K = 0, 1, ..., UBA-NA: Set C[PTR+K] := A[NA+K]. [End of loop.]
   [End of If structure.]
4. Return.

MERGEPASS(A, N, L, B)
[Merges pairs of adjacent sorted subarrays of A, each of length L, into B.]
1. Set Q := INT(N/(2*L)), S := 2*L*Q and R := N-S. [Q pairs of full-length subarrays; R elements remain.]
2. [Merge the Q pairs of subarrays.] Repeat for J = 1, 2, ..., Q:
   (a) Set LB := 1+(2*J-2)*L. [Finds lower bound of first subarray.]
   (b) Call MERGE(A, L, LB, A, L, LB+L, B, LB).
   [End of loop.]
3. [Only one subarray left?]
   If R <= L, then: Repeat for J = 1, 2, ..., R: Set B[S+J] := A[S+J]. [End of loop.]
   Else: Call MERGE(A, L, S+1, A, R, L+S+1, B, S+1).
   [End of If structure.]
4. Return.

MERGESORT(A, N)
1. Set L := 1. [Initializes the length of the subarrays to be merged.]
2. Repeat Steps 3 to 5 while L < N:
3. Call MERGEPASS(A, N, L, B). [Merges subarrays of length L from A into B.]
4. Call MERGEPASS(B, N, 2*L, A). [Merges subarrays of length 2L from B back into A.]
5. Set L := 4*L.
[End of Step 2 loop.]
6. Exit.
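A bottom-up Python sketch in the spirit of MERGEPASS and MERGESORT (0-based, names ours): src and dst alternate roles just as A and B do, and the run length doubles on each pass:

def merge_runs(src, lo, mid, hi, dst):
    """Merge the sorted runs src[lo:mid] and src[mid:hi] into dst[lo:hi]."""
    i, j = lo, mid
    for k in range(lo, hi):
        if i < mid and (j >= hi or src[i] <= src[j]):   # <= keeps the sort stable
            dst[k] = src[i]
            i += 1
        else:
            dst[k] = src[j]
            j += 1

def merge_sort(a):
    """Bottom-up (iterative) merge sort using one auxiliary array."""
    n = len(a)
    src, dst = a, a[:]                    # the auxiliary array B
    length = 1                            # L: current run length
    while length < n:
        for lo in range(0, n, 2 * length):    # one merge pass over src
            mid = min(lo + length, n)
            hi = min(lo + 2 * length, n)
            merge_runs(src, lo, mid, hi, dst)
        src, dst = dst, src               # alternate the roles of A and B
        length *= 2
    if src is not a:
        a[:] = src                        # ensure the result ends up in a
    return a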

Complexity: Let f(n) denote the number of comparisons needed to sort an n-element array A. Each pass requires at most n comparisons, and since the run length doubles on every pass there are at most log₂ n passes. Accordingly, for both the worst case and the average case, f(n) <= n log₂ n. Merge sort is stable and uses extra memory of size n (the auxiliary array B). It is often the best choice for sorting a linked list: in this situation it is relatively easy to implement a merge sort in such a way that it requires only O(1) extra space, and the slow random-access performance of a linked list makes some other algorithms perform poorly, and others completely impossible.