<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Matt's Engineering Blog]]></title><description><![CDATA[Matt's Engineering Blog]]></description><link>https://mcochrane.com/</link><image><url>https://mcochrane.com/favicon.png</url><title>Matt&apos;s Engineering Blog</title><link>https://mcochrane.com/</link></image><generator>Ghost 5.2</generator><lastBuildDate>Tue, 30 Sep 2025 17:55:16 GMT</lastBuildDate><atom:link href="https://mcochrane.com/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Permutation Generation Algorithms]]></title><description><![CDATA[<h2 id="why-are-permutation-algorithms-important">Why are Permutation Algorithms Important?</h2><p>Generating permutations is a frequently occurring problem in both coding challenges and real-world problems. &#xA0;Any time you encounter the need to enumerate the arrangements of a set of items, you&apos;re considering permutations. &#xA0;</p><p>In addition to that, permutation generation algorithms are an</p>]]></description><link>https://mcochrane.com/permutation-algorithms/</link><guid isPermaLink="false">62be494603124900018d74b3</guid><category><![CDATA[algorithms]]></category><category><![CDATA[math]]></category><category><![CDATA[combinatorics]]></category><category><![CDATA[backtracking]]></category><category><![CDATA[recursion]]></category><dc:creator><![CDATA[Matthew Cochrane]]></dc:creator><pubDate>Mon, 04 Jul 2022 12:39:07 GMT</pubDate><content:encoded><![CDATA[<h2 id="why-are-permutation-algorithms-important">Why are Permutation Algorithms Important?</h2><p>Generating permutations is a frequently occurring problem in both coding challenges and real-world problems. 
&#xA0;Any time you encounter the need to enumerate the arrangements of a set of items, you&apos;re considering permutations. &#xA0;</p><p>In addition to that, permutation generation algorithms are an important building block in more complicated algorithms. &#xA0;It&apos;s important to have a good understanding of the base algorithm in order to apply it in those more complicated settings. &#xA0;Enumerating all the permutations of a set of $n$ elements takes $O(n \times n!)$ time, which quickly becomes impractical as $n$ grows. &#xA0;Therefore it&apos;s important to be able to reason about permutation generation algorithms, so that the time complexity of any practical algorithm built on them can be improved.</p><p>Lastly, permutation generation algorithms are a great example of recursion and backtracking. &#xA0;They are a quintessential application of these techniques and demonstrate some of the tradeoffs of different approaches.</p><h2 id="whats-a-permutation">What&apos;s a Permutation?</h2><p>A permutation is an arrangement of a set of items. &#xA0;For example, <code>[1,2,3]</code>, <code>[1,3,2]</code> and <code>[3,1,2]</code> are all permutations of the set <code>{1,2,3}</code>.</p><p>A set of $n$ items has $n!$ distinct arrangements or permutations. &#xA0;A way to reason about this is to think about how many options there are at each position. &#xA0;To generate a permutation of length $n$, there are $n$ options that could be placed in the first slot, $n-1$ options in the second slot, $n-2$ in the third slot, and so on until the last slot, which has only one option. &#xA0;Multiplying these choices together gives $n \times (n-1) \times (n-2) \times ... \times 1 = n!$ total options.</p><h2 id="time-and-space-complexity">Time and Space Complexity</h2><p>Generating a list of all $n!$ permutations of a set of $n$ items inherently requires $O(n \times n!)$ space to store all the permutations since there are $n!$ permutations, each of length $n$.
&#xA0;The time complexity is also $O(n \times n!)$. &#xA0;For all but small $n$, then, it&apos;s not practical to enumerate all permutations into a single list. &#xA0;A more useful interface is an iterator that lazily yields one permutation at a time.</p><pre><code class="language-python">def permutations(items: Collection) -&gt; Iterable[Tuple]:
    ...</code></pre><p>This still has a time complexity of $O(n \times n!)$ because it returns immutable tuples (copying each tuple takes $O(n)$ time for each of the $n!$ permutations). &#xA0;This could be improved to $O(n!)$ if we were happy to mutate a list internally and keep yielding the same list, but the loss of safety and likelihood of misuse make it unlikely to be worthwhile. &#xA0;Either way, the space complexity drops to $O(n)$, which is the space required to store the current permutation and the call stack. &#xA0;We&apos;ll see shortly how naive implementations require $O(n! \times n)$ or $O(n^2)$ space, and how to avoid that.</p><h2 id="existing-functions">Existing Functions</h2><p>The Python standard library already includes a number of combinatoric iterators such as <code>combinations()</code>, <code>product()</code> and of course <code><a href="https://docs.python.org/3/library/itertools.html?highlight=permutation#itertools.permutations">permutations()</a></code>.</p><h2 id="a-general-approach">A General Approach</h2><p>We previously concluded that the number of permutations of a set of $n$ items is $n!$, based on having $n$ choices in the first position, $n-1$ choices in the second position, and so on. &#xA0;This lends itself to a recursive solution when generating the permutations.</p><p>Let $S$ be a set of $n$ items. &#xA0;Every permutation of $S$ begins with one of the values of $S$. &#xA0;Once we pick the first value, there are $n-1$ items left to pick from. &#xA0;These $n-1$ remaining items form a sub-problem. &#xA0;We can then formulate a recurrence relation: <code>permutations = starting value + permutations(remaining values), for each starting value</code>. &#xA0;The base case occurs when there are no remaining values, in which case we should return a list containing a single empty permutation.
&#xA0;That is, there is one permutation of <code>[]</code>, which is <code>[]</code>.</p><h2 id="naive-implementation">Naive Implementation</h2><p>Here&apos;s a naive way to implement the <code>permutations()</code> function.</p><pre><code class="language-python">from typing import List, Sequence, Tuple

def permutations(items: Sequence) -&gt; List[Tuple]:
    if not items:
        return [()]
    results = []
    for start_idx, start_val in enumerate(items):
        remaining_items = items[:start_idx] + items[start_idx + 1:]  # everything except the chosen item
        for permutation_of_remaining in permutations(remaining_items):
            results.append((start_val, *permutation_of_remaining))
    return results
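
# Usage sketch (illustrative, not part of the original post): for a
# 3-element list this returns all 3! = 6 permutations, in the same
# order as itertools.permutations:
#   permutations([1, 2, 3]) == [(1, 2, 3), (1, 3, 2), (2, 1, 3),
#                               (2, 3, 1), (3, 1, 2), (3, 2, 1)]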
</code></pre><p>This top-down approach has a huge space complexity. &#xA0;If we have a set of $n$ items, then the top-level call can build a list of up to $n!$ permutations, each of length $n$; the next call down the stack builds up to $(n-1)!$ permutations, each of length $n-1$, and so on. &#xA0;This gives a space complexity of $O((n! \times n) + ((n-1)! \times (n-1)) + ... + (1 \times 1))$, though the first term dominates, so it simplifies to $O(n! \times n)$. &#xA0;Even if the final output array is excluded (as is common practice), the space complexity would still be $O((n-1)! \times (n-1)) = O(n!)$.</p><p>So how can we improve this?</p><h2 id="backtracking">Backtracking</h2><p>We can improve the space complexity of the approach above by trying to eliminate the internal <code>results</code> list in every sub-call. &#xA0;One way to do this is to keep a list for our current permutation and update it using backtracking. &#xA0;When a leaf node is encountered (when <code>items</code> is empty), copy the current permutation to an overall result array.</p><pre><code class="language-python">from typing import List, Sequence, Tuple

def permutations(all_items: Sequence) -&gt; List[Tuple]:
    results = []
    current_permutation = []
    def find_permutations(items: Sequence) -&gt; None:
        if not items:
            results.append(tuple(current_permutation))
        for i, item in enumerate(items):
            current_permutation.append(item)  # choose
            find_permutations(items[:i] + items[i+1:])  # explore the rest
            current_permutation.pop()  # un-choose (backtrack)
    
    find_permutations(all_items)
    return results
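
# Trace (illustrative, not part of the original post) for [1, 2]:
# append 1, recurse on [2], append 2, recurse on [] and record (1, 2);
# pop 2, pop 1; then append 2 and eventually record (2, 1).
# So permutations([1, 2]) == [(1, 2), (2, 1)]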
</code></pre><p>But the space complexity is still $O(n! \times n)$ because the <code>results</code> array contains every permutation. &#xA0;So let&apos;s return an iterable instead of a list. &#xA0;Here it&apos;s done using a recursive generator and <code>yield from</code>.</p><pre><code class="language-python">from typing import Iterable, Sequence, Tuple

def permutations(all_items: Sequence) -&gt; Iterable[Tuple]:
    current_permutation = []
    def find_permutations(items: Sequence) -&gt; Iterable[Tuple]:
        if not items:
            yield tuple(current_permutation)
        for i, item in enumerate(items):
            current_permutation.append(item)
            yield from find_permutations(items[:i] + items[i+1:])
            current_permutation.pop()
    
    yield from find_permutations(all_items)</code></pre><p>This reduces the space complexity to $O(n^2)$: the recursion is at most $n$ calls deep, and each call to <code>find_permutations</code> holds its own copy of up to $n$ remaining items.</p><p>Again, how can we do better?</p><h2 id="the-swapping-method">The &apos;Swapping&apos; Method</h2><p>Often when rearranging arrays, there&apos;s an approach that involves swapping elements instead of copying chunks of the array. &#xA0;In the code below, we iterate through the permutations by swapping one element into place, recursing, then swapping it back. &#xA0;This is equivalent to how the previous method fixed the first value, then passed the remaining elements into the recursive call.</p><pre><code class="language-python">from typing import Collection, Iterable, Tuple

def permutations(all_items: Collection) -&gt; Iterable[Tuple]:
    items = [*all_items] # make a copy so we don&apos;t mutate the original
    def find_permutations(start_idx: int) -&gt; Iterable[Tuple]:
        if start_idx &gt;= len(items) - 1:
            # Skip swapping the last item with itself and yield here.
            # (Using &gt;= rather than == means an empty input still
            # yields one empty permutation, matching itertools.)
            yield tuple(items)
            return
        for i in range(start_idx, len(items)):
            items[start_idx], items[i] = items[i], items[start_idx]  # place items[i] in this slot
            yield from find_permutations(start_idx + 1)
            items[start_idx], items[i] = items[i], items[start_idx]  # swap back (backtrack)
    
    yield from find_permutations(0)</code></pre><p>This approach has a space complexity of $O(n)$ for the <code>items</code> list and for the call stack. &#xA0;Its time complexity is the same as the previous approach, $O(n \times n!)$.</p><h2 id="bonus-bit-manipulation">Bonus: Bit Manipulation</h2><p>Another way to achieve $O(n)$ space complexity is to use a bitmask to record which values have already been used. &#xA0;Python integers are arbitrary precision, so the mask itself takes $O(n)$ bits; in a language with fixed-width integers, a single machine word limits this approach to $n \le 64$ on a 64-bit architecture.</p><pre><code class="language-python">from typing import Collection, Iterable, Tuple

def permutations(all_items: Collection) -&gt; Iterable[Tuple]:
    current_permutation = []
    complete_mask = (1 &lt;&lt; len(all_items)) - 1  # one bit per item, all set
    def find_permutations(used_mask: int) -&gt; Iterable[Tuple]:
        if used_mask == complete_mask:
            yield tuple(current_permutation)
        for i, item in enumerate(all_items):
            if used_mask &amp; (1 &lt;&lt; i):
                continue  # item i is already in the current permutation
            current_permutation.append(item)
            yield from find_permutations(used_mask | (1 &lt;&lt; i))
            current_permutation.pop()
    
    yield from find_permutations(0)</code></pre>]]></content:encoded></item></channel></rss>