In most languages without the niceties of tail-call optimisation, elegant recursive solutions quickly degrade into ones that consume large amounts of memory: every function call creates a new stack frame, and a deep enough recursion can exhaust the stack altogether. A typical application of recursive functions is the depth-first search:
def search_graph(graph, name):
    for node in graph:
        if node.name == name:
            yield node
        # Recurse into the children and re-yield any matches found there.
        for item in search_graph(node.children, name):
            yield item
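The exact node type doesn’t matter; anything with a name and a children attribute will do. For example, with a throwaway Node class like the one below, a toy graph could be searched like this:

class Node:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

graph = [Node("a", [Node("target")]), Node("target")]
for match in search_graph(graph, "target"):
    print(match.name)   # prints "target" twice, one line per matching node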
There’s a way around this, but it requires us to rethink how we approach the problem. Using a LIFO stack, we can refactor our recursive solutions into iterative ones:
def search_graph(graph, name):
    # The explicit stack plays the role of the call stack: it holds
    # the nodes we still have to visit.
    stack = list(graph)
    while stack:
        node = stack.pop()           # LIFO: take the most recently added node
        if node.name == name:
            yield node
        stack.extend(node.children)  # schedule the children to be visited next
The way the iterative method works can feel unorthodox at first: instead of letting the interpreter keep track of where we are through nested call frames, we maintain an explicit stack of nodes still to be visited. Each pass through the loop pops the most recently added node, checks it, and pushes its children, so the traversal is still depth-first but the call stack never grows.
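A quick trace makes this concrete. Reusing the throwaway Node class from above, and a walk variant that yields every node it visits rather than only the matches, we can see that the traversal still goes deep before it goes wide, although siblings come off the stack in reverse order because the last child pushed is the first one popped:

def walk(graph):
    # Same stack discipline as search_graph, but yields every node visited.
    stack = list(graph)
    while stack:
        node = stack.pop()
        yield node
        stack.extend(node.children)

root = Node("root", [Node("a", [Node("a1")]), Node("b")])
print([n.name for n in walk([root])])   # ['root', 'b', 'a', 'a1']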
To implement a breadth-first version of our graph-searching algorithm, the approach is only slightly different. Instead of a LIFO stack, we use a FIFO queue:
from collections import deque

def breadth_first(graph, name):
    queue = deque(graph)
    while queue:
        node = queue.popleft()       # FIFO: take the oldest node in the queue
        if node.name == name:
            yield node
        queue.extend(node.children)  # children join the back of the queue
  
Notice how much the semantics and behaviour have changed from only a few lines of difference. Visualising how the FIFO queue drives the traversal is left as an exercise for the reader. Note that in this case a double-ended queue (deque) is more efficient than a plain list, since we’re removing elements from the “head” of the sequence rather than the end: deque.popleft() runs in constant time, whereas list.pop(0) has to shift every remaining element.
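To see the behavioural difference concretely, here is the same toy graph walked with a queue instead of a stack (again an illustrative walk-style helper rather than the name-matching generator):

from collections import deque

def walk_breadth_first(graph):
    # Same as walk() above, but FIFO: children wait their turn behind
    # everything already in the queue, so we visit level by level.
    queue = deque(graph)
    while queue:
        node = queue.popleft()
        yield node
        queue.extend(node.children)

root = Node("root", [Node("a", [Node("a1")]), Node("b")])
print([n.name for n in walk_breadth_first([root])])   # ['root', 'a', 'b', 'a1']

Compare that with the ['root', 'b', 'a', 'a1'] the stack-based walk produced above.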
  
Cyclic references often blow up recursive programs. When they might be present, the recommended approach (from experience) is to keep a set of “seen” elements, provided your data structures are hashable, and skip any node that has already been visited.
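Sketched on top of the stack-based search (plain Python objects are hashable by default, so nodes built like the toy Node above qualify):

def search_graph_safe(graph, name):
    stack = list(graph)
    seen = set()
    while stack:
        node = stack.pop()
        if node in seen:
            continue                # already visited: don't loop around a cycle
        seen.add(node)
        if node.name == name:
            yield node
        stack.extend(node.children)

a = Node("a")
b = Node("b", [a])
a.children.append(b)                # introduce a cycle: a -> b -> a
print([n.name for n in search_graph_safe([a], "b")])   # ['b']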