Which sorting method is the best in your opinion, and why? Are there any methods faster than ones based on the quicksort algorithm? If so, please be so kind as to tell me where I can read about them.
Answer 1, Authority 100%
- There is no single best method, if we are talking about “reasonable” algorithms and do not count esoteric ones like bogosort or intelligent design sort.
In modern languages, the sorting algorithm is usually chosen depending on the size of the input data. That is, for small arrays an O(n^2) insertion sort often turns out to be more efficient than, for example, an O(n log n) quicksort. Naturally, an O(n log n) sort is chosen for large sizes.
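The hybrid approach described above can be sketched in Python. This is a minimal illustrative sketch, not any library's actual implementation; the cutoff of 16 and the middle-element pivot are arbitrary choices made here for illustration:

```python
def insertion_sort(a, lo, hi):
    """Sort a[lo..hi] (inclusive) in place; fast for small ranges."""
    for i in range(lo + 1, hi + 1):
        key = a[i]
        j = i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def hybrid_quicksort(a, lo=0, hi=None):
    """Quicksort that falls back to insertion sort on small subarrays."""
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        if hi - lo < 16:                # small range: insertion sort wins
            insertion_sort(a, lo, hi)
            return
        pivot = a[(lo + hi) // 2]       # illustrative middle-element pivot
        i, j = lo, hi
        while i <= j:                   # Hoare-style partition
            while a[i] < pivot:
                i += 1
            while a[j] > pivot:
                j -= 1
            if i <= j:
                a[i], a[j] = a[j], a[i]
                i += 1
                j -= 1
        hybrid_quicksort(a, lo, j)      # recurse on the left part
        lo = i                          # iterate on the right part
```

Real implementations tune the cutoff empirically and use smarter pivot selection, but the structure is the same.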
- One can rigorously prove that if S is a sorting algorithm based on building a decision tree (i.e., a comparison sort), then O(n log n) is the minimum possible worst-case running time of S. This means that all comparison-based algorithms like mergesort, insertion sort, etc. cannot run in time less than O(n log n).
However, there are sorts that do not use decision trees and run in linear time under certain restrictions, for example counting sort or radix sort.
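One example of such a restricted, non-comparison sort is counting sort; this sketch assumes the keys are non-negative integers below a known bound k, and runs in O(n + k) time:

```python
def counting_sort(a, k):
    """Sort non-negative integers < k in O(n + k) time, without comparisons."""
    counts = [0] * k
    for x in a:                # tally how many times each value occurs
        counts[x] += 1
    out = []
    for value, c in enumerate(counts):
        out.extend([value] * c)  # emit each value as many times as it occurred
    return out
```

The restriction is essential: if k grows much faster than n, the O(n + k) bound stops being linear in the input size.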
- From the literature: Cormen et al., Introduction to Algorithms.
Answer 2, Authority 88%
I’ll just leave this here.
Source: Timsort sorting algorithm
Answer 3, Authority 44%
@Cattle_Hell_name rightly said that there is no single best method.
Among almost universally applicable algorithms, quicksort is IMHO the fastest (O(n log n) time), although (even when implemented correctly) it can occasionally, in practice very rarely, degrade to O(n^2) time.
It requires O(log n) extra memory, i.e., in practice you can assume it requires essentially none.
The main disadvantage of quicksort is that it is an unstable algorithm.
A sort is called stable if the relative order of entries with equal keys is preserved after sorting. Obviously, if you are sorting an array of numbers, stability does not matter (one 10 is indistinguishable from another 10). For sorting records (structures), this is not the case (although it depends on the applied problem).
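A short example makes the stability point concrete. Python's built-in sort (Timsort) is stable, so records with equal keys keep their input order:

```python
# Records of (name, score); two pairs share the same score.
records = [("alice", 10), ("bob", 5), ("carol", 10), ("dave", 5)]

# A stable sort by score keeps equal-score entries in their original order:
# "bob" stays before "dave", "alice" stays before "carol".
by_score = sorted(records, key=lambda r: r[1])
# by_score == [("bob", 5), ("dave", 5), ("alice", 10), ("carol", 10)]
```

With an unstable sort, the relative order of ("bob", 5) and ("dave", 5) would be unspecified.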
Among stable sorts (I am considering O(n log n) algorithms), IMHO the fastest is merge sort (mergesort) in its nearly simplest implementation, which requires N/2 additional memory.
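A sketch of such a merge sort: by copying only the left half into a temporary buffer, the merge needs at most N/2 extra memory. This is an illustrative top-down formulation, not any particular library's code:

```python
def merge_sort(a, lo=0, hi=None):
    """Stable merge sort of a[lo:hi] in place, buffering only the left half."""
    if hi is None:
        hi = len(a)
    if hi - lo <= 1:
        return
    mid = (lo + hi) // 2
    merge_sort(a, lo, mid)
    merge_sort(a, mid, hi)
    left = a[lo:mid]              # buffer of size <= N/2: only the left half
    i, j, k = 0, mid, lo
    while i < len(left) and j < hi:
        if a[j] < left[i]:        # strict < keeps equal keys in order (stable)
            a[k] = a[j]
            j += 1
        else:
            a[k] = left[i]
            i += 1
        k += 1
    # Any leftover right-half elements are already in place; copy leftover left.
    a[k:k + len(left) - i] = left[i:]
```

The right half never needs buffering: its unconsumed elements already sit past the write position.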
Timsort (which is also a kind of merge sort) shows similar results (sometimes perhaps even faster, but usually slower). It typically requires 30-40% of N in additional memory.
Also, taking this opportunity, I want to draw attention to yamsort, another stable merge-sort algorithm that needs little additional memory (about 6% of the sorted array). It is somewhat slower than Timsort, but when sorting very large arrays (especially on multitasking systems), when exhausted memory causes paging, this algorithm turns out to be much faster than other stable sorts.
This link (in Sourse/README.TXT) describes the algorithm and gives some benchmark results for various sorts, as well as the source code of several sorts and an example program for benchmarking them.