Omnimaga
General Discussion => Other Discussions => Math and Science => Topic started by: fb39ca4 on August 08, 2011, 02:24:07 am
-
What would you say the best sorting algorithm is for a list of around 30-50 values?
-
What type of data is being sorted? What order do the data need to be sorted in? How much memory is available? Will the data be fairly well ordered or generally random? What language is this in? Is speed more important than memory usage?
Without that information, I'd say use Quicksort or Flashsort if the data set is expected to fit their best/average case scenarios.
-
It is going to be for sorting the sprites in my raycasting engine from far to near. It is on the Nspire, so memory usage is not a concern; I'd rather go for speed. The data will be quite random, and I am making this in C.
-
Flashsort would probably be a pretty good choice if you have a lot of sprites (>80).
http://www.neubert.net/Flacodes/FLACodes.html#Sprung3 (http://www.neubert.net/Flacodes/FLACodes.html#Sprung3)
Use Quicksort otherwise.
-
I was also looking at comb sort. It seems very simple to implement, so I may do that and then see if I need a faster algorithm.
-
If you don't need a faster algorithm, use that. However, the Wikipedia page shows a division by a floating-point number, which is likely to be slow.
-
I can use a predefined table for the gap if need be, but I am already doing over 1500 floating-point divides per frame :P.
-
When implementing a quicksort, the comparison function needs to be inlined, otherwise performance can suffer significantly. In other words, don't use qsort() from stdlib.h.
For several dozen items, Shell sort usually has good performance and pretty simple code: http://embeddedgurus.com/stack-overflow/2009/03/sorting-in-embedded-systems/
The library in TIGCC has used Shell sort for qsort() from the beginning, but the implementation used the original, naive gap sequence, so I improved it for GCC4TI.
-
Inspired by this topic I made my first sorting algorithm, here's some C code:
#include <stdio.h>
#include <stdlib.h>
#include <limits.h>
int main();
int *sort(int array[], int array_length);
int check_if_num_in_array(int num, int array[], int array_length);
int main()
{
/* Tests sort() with a sample case */
int num_array[] = {5, 10, 12, 1, 4, 1000};
int num_array_length = sizeof(num_array) / sizeof(int);
int *sorted = sort(num_array, num_array_length); /* call sort() once, not once per element */
int b;
for (b = 0; b < num_array_length; b++)
{
printf( "%d\n", sorted[b] );
}
free(sorted);
return 0;
}
int *sort(int array[], int array_length)
{
/* Sorts array[] of length array_length in descending order.
   Note: a value that appears more than once is only placed once,
   because the membership check skips anything already in sorted[].
   The result is heap-allocated; returning a pointer to a local
   array would leave a dangling pointer once sort() returns. */
int *sorted = malloc(array_length * sizeof(int));
int i; /* Big loop */
int u; /* Small loop */
int maximum;
for (i = 0; i < array_length; i++)
{
maximum = INT_MIN; /* not 0, so negative values work too */
for (u = 0; u < array_length; u++)
{
/* If bigger than current maximum and not in sorted[] already */
if ( array[u] > maximum && check_if_num_in_array(array[u], sorted, i) == 0 )
{
maximum = array[u];
}
}
sorted[i] = maximum;
}
return sorted;
}
int check_if_num_in_array(int num, int array[], int array_length)
{
/* Checks if number num is in array[] of length array_length */
int v;
for (v = 0; v < array_length; v++)
{
if (array[v] == num)
{
return 1;
}
}
return 0;
}
The main() function is just there as an example. Can anybody tell me how fast it is compared to more famous ones? Thanks!
EDIT: Yeah I prototype main().