Introduction


For this project I have decided to do a demo of Chan's Algorithm. I will focus on the 2-dimensional case, although Chan's Algorithm achieves \(O(n \log h)\) time complexity in both 2- and 3-dimensional space, where \(h\) denotes the number of points on the hull. His algorithm combines two relatively simple convex hull algorithms, the Graham Scan and the Jarvis March. In simple terms, Chan's Algorithm divides the points into subsets and computes the convex hull of each subset using the Graham Scan. Then, using the Jarvis March, the convex hull of these mini convex hulls is computed by finding tangents between them.


In Detail


Computing the Mini Hulls

The algorithm begins by partitioning the point set into subsets of size \(m\), forming \(\left \lceil{n/m}\right \rceil\) subsets. Each subset's convex hull can be computed in \(O(m \log m)\) time using the Graham Scan. Therefore, this preprocessing step of computing the mini convex hulls takes \(O((n/m)(m \log m))\), or rather \(O(n \log m)\), time. The animation below gives a visual idea of what this might look like.
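As a rough sketch of this preprocessing step, here is a minimal Python version (the function names `graham_scan` and `mini_hulls` are my own, not from Chan's paper):

```python
from math import atan2

def cross(o, a, b):
    # Cross product of vectors o->a and o->b; > 0 means b is left of o->a
    return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

def graham_scan(points):
    # Convex hull of a 2-D point list, in counter-clockwise order
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    pivot = min(pts, key=lambda p: (p[1], p[0]))   # lowest, then leftmost
    # Sort remaining points by polar angle around the pivot (ties by distance)
    rest = sorted((p for p in pts if p != pivot),
                  key=lambda q: (atan2(q[1]-pivot[1], q[0]-pivot[0]),
                                 (q[0]-pivot[0])**2 + (q[1]-pivot[1])**2))
    hull = [pivot]
    for q in rest:
        # Pop points that would make a clockwise (or collinear) turn
        while len(hull) >= 2 and cross(hull[-2], hull[-1], q) <= 0:
            hull.pop()
        hull.append(q)
    return hull

def mini_hulls(points, m):
    # Partition into ceil(n/m) groups of size m and hull each group
    return [graham_scan(points[i:i+m]) for i in range(0, len(points), m)]
```

Each group of \(m\) points costs \(O(m \log m)\) to hull, and there are \(\lceil n/m \rceil\) groups, giving the \(O(n \log m)\) total above.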

[Click sketch to re-run with a new, randomized point set]

Executing the Jarvis March

Subsequently, the Jarvis March must be performed to find the all-encompassing convex hull of the point set. On its own, the Jarvis March has a time complexity of \(O(nh)\), which is \(O(n^2)\) in the worst case - however, in this case, we will be doing a modified Jarvis March:

Because each mini hull is convex and its vertices are stored in order, finding a single tangent takes \(O(\log m)\) time (binary search), and therefore finding the tangents to all \(\left \lceil{n/m}\right \rceil\) mini hulls from a single point takes \(O((n/m)(\log m))\) time.
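To illustrate what this tangent search computes: the right tangent from an external point \(p\) is the hull vertex \(q\) such that every other vertex lies to the left of the directed line from \(p\) to \(q\). The sketch below (the name `right_tangent` is my own) uses a linear scan for brevity; Chan's \(O(\log m)\) bound comes from replacing this scan with a binary search over the hull's ordered vertex array.

```python
def cross(o, a, b):
    # Cross product of vectors o->a and o->b; > 0 means b is left of o->a
    return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

def right_tangent(p, hull):
    """Vertex of `hull` (convex, counter-clockwise order) that is the right
    tangent point from the external point p: every other vertex lies to the
    left of the directed line p -> tangent. Linear scan for clarity; Chan's
    bound requires an O(log m) binary search over the vertex array."""
    best = hull[0]
    for q in hull[1:]:
        if cross(p, best, q) < 0:   # q is to the right of p -> best
            best = q
    return best
```

Seen from a point outside the hull, all vertices fit inside an angular wedge of less than 180 degrees, which is what makes both the scan above and the binary search well defined.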

Most importantly: What is m?

In order to get an overall time complexity of \(O(n \log h)\), the value chosen for \(m\) must be (approximately) \(h\), the number of points on the convex hull. Why is this true? As stated earlier, the Graham Scan step takes \(O(n \log m)\) - we get a time complexity of \(O(n \log h)\) if \(m = h\). For the Jarvis March step, we currently have a time complexity of \(O(h \cdot (n/m)(\log m))\), which is equivalent to \(O(n \log h)\) if \(m = h\). Since \(h\) is not known in advance, Chan's Algorithm makes a guess for \(m\), and if it turns out to be insufficient, starts over again with a larger value. In his paper (pg. 363) he suggests beginning with \(m = 2^{2^t}\), where \(t = 1\). If the number of wrapping steps exceeds \(m\), restart and increment \(t\).
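Putting the pieces together, here is a compact sketch of the whole scheme (function names are my own; the tangent-finding inside the wrapping step is a linear scan rather than the \(O(\log m)\) binary search, so this shows the control flow and the guessing strategy, not the final complexity):

```python
from math import atan2

def cross(o, a, b):
    # Cross product of vectors o->a and o->b; > 0 means b is left of o->a
    return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

def graham_scan(points):
    # Convex hull of a 2-D point list, in counter-clockwise order
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    pivot = min(pts, key=lambda p: (p[1], p[0]))
    rest = sorted((p for p in pts if p != pivot),
                  key=lambda q: (atan2(q[1]-pivot[1], q[0]-pivot[0]),
                                 (q[0]-pivot[0])**2 + (q[1]-pivot[1])**2))
    hull = [pivot]
    for q in rest:
        while len(hull) >= 2 and cross(hull[-2], hull[-1], q) <= 0:
            hull.pop()
        hull.append(q)
    return hull

def chan(points):
    n = len(points)
    t = 1
    while True:
        m = min(2 ** (2 ** t), n)            # current guess for h
        # Step 1: mini hulls of ceil(n/m) groups, O(n log m) total
        hulls = [graham_scan(points[i:i+m]) for i in range(0, n, m)]
        # Step 2: Jarvis March over the mini hulls, at most m wrapping steps
        start = min(points, key=lambda p: (p[1], p[0]))  # guaranteed on the hull
        result = [start]
        for _ in range(m):
            p = result[-1]
            nxt = None
            for H in hulls:
                for q in H:                  # linear scan; Chan uses binary search
                    if q == p:
                        continue
                    if nxt is None or cross(p, nxt, q) < 0:
                        nxt = q
            if nxt == start:                 # wrapped all the way around: done
                return result
            result.append(nxt)
        t += 1                               # guess too small; retry with m squared
```

Because \(m\) is squared on each restart (\(2^{2^t}\) for \(t = 1, 2, 3, \dots\)), only \(O(\log \log h)\) restarts occur, and the geometric growth of the per-round costs keeps the total at \(O(n \log h)\).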


Visualization


This visualization illustrates the progression of Chan's Algorithm. The method for guessing \(m\) is not illustrated well on a data set this small, so I have instead chosen to double the value of \(m\) each time. If the hull is not completed within \(m\) wrapping steps - that is, if \(m\) is still smaller than the number of points on the hull - \(m\) doubles and the algorithm re-tries with this new value. Click to begin.


Credits


  1. Chan's Paper
  2. Python code for Jarvis March
  3. Lecture notes