Big-Oh is a way to describe the efficiency of an algorithm in the worst-case scenario. It is determined by the rate of growth of the number of steps an algorithm takes as the input size increases. Big-Oh can be categorized anywhere from constant (O(1)) to exponential (O(2^n)) and beyond. The notation can be confusing at first because it does not allow the usage of a constant multiple, as it is already implied. If an algorithm grows at a rate of 3n, then it is included in the case for O(n), because O(n) means the algorithm grows at a rate of some constant multiplied by n. The goal of measuring Big-Oh is to see how well your algorithm performs under the worst-case scenario and then to determine where it is possible to improve your algorithm.
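To make the constant-multiple idea concrete, here is a minimal Python sketch (not from any particular textbook) that counts the steps of a linear pass versus an all-pairs pass. The function names are just for illustration; the point is that 3n, 5n, or n all land in O(n), while the pairwise version grows as O(n^2).

```python
def linear_scan(items):
    """O(n): touches each element once; 2n or 3n steps still collapse to O(n)."""
    steps = 0
    for _ in items:
        steps += 1
    return steps

def all_pairs(items):
    """O(n^2): touches every pair of elements, so steps grow quadratically."""
    steps = 0
    for _ in items:
        for _ in items:
            steps += 1
    return steps

# Growing n by 10x grows linear_scan by 10x but all_pairs by 100x.
for n in (10, 100, 1000):
    data = list(range(n))
    print(n, linear_scan(data), all_pairs(data))
```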
The worst-case efficiency of these sorting algorithms is O(n^2) for selection, insertion, bubble, and quick sort, and O(n log n) for merge sort and Timsort. Although quick sort has a quadratic worst case, in real-world applications it is incredibly fast. http://www.sorting-algorithms.com/ has a great visualization of the different sorting algorithms (excluding Timsort). There you can see that quick and merge sort are much faster than insertion, selection, and bubble sort. The only lists where quick sort appears to have trouble are those with few unique items. Robert Sedgewick and Jon Bentley noticed these sub-optimal cases and made some tweaks to the algorithm that optimize quick sort's performance. The adjustments are posted in this presentation: http://www.sorting-algorithms.com/static/QuicksortIsOptimal.pdf.
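One of the central tweaks in that presentation is three-way partitioning, which groups keys equal to the pivot in the middle so they never need to be recursed on. Here is a rough Python sketch of that idea; the details (random pivot choice, function name) are my own simplifications rather than the exact code from the slides:

```python
import random

def quicksort3(a, lo=0, hi=None):
    """In-place quicksort with three-way (Dutch-flag) partitioning.

    Elements equal to the pivot end up in a middle block that is
    excluded from recursion, which is exactly what helps on lists
    with few unique items.
    """
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    pivot = a[random.randint(lo, hi)]  # random pivot guards against sorted input
    lt, i, gt = lo, lo, hi
    while i <= gt:
        if a[i] < pivot:
            a[lt], a[i] = a[i], a[lt]
            lt += 1
            i += 1
        elif a[i] > pivot:
            a[i], a[gt] = a[gt], a[i]
            gt -= 1
        else:
            i += 1
    quicksort3(a, lo, lt - 1)   # everything left of the pivot block
    quicksort3(a, gt + 1, hi)   # everything right of the pivot block

data = [random.choice([1, 2, 3]) for _ in range(20)]  # few unique items
quicksort3(data)
print(data)
```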
What I observed in the visualizations, I also observed in lab, where we compared all of these sorting algorithms on inputs of size 400, 800, 1200, 1600, 2000, and 2400.
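The lab code itself isn't reproduced here, but a timing harness along these lines (the sort and input sizes are the only parts taken from the lab; the rest is a hypothetical setup) is enough to see the quadratic growth of insertion sort across those input sizes:

```python
import random
import timeit

def insertion_sort(a):
    """Textbook O(n^2) insertion sort, used as one of the lab's algorithms."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

for n in (400, 800, 1200, 1600, 2000, 2400):
    data = [random.random() for _ in range(n)]
    # Sort a fresh copy each run so every trial starts from unsorted input.
    t = timeit.timeit(lambda: insertion_sort(data[:]), number=5)
    print(f"n={n:5d}  insertion sort: {t:.4f}s")
```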