How do I calculate the average height of a binary tree?

The average height of a binary tree is equal to the sum of the heights of all nodes divided by the total number of nodes.

However, how do you calculate the sum of each node's height? Do you need to include the leaf nodes as well? And is the height of a leaf node equal to 1?

A thorough explanation with examples would be great!

For example, with the binary tree shown below:

       1
   2       9
0   3    4   10


The answer should be 1.4285. Starting from the leaf nodes, I get a total of 1 + 1 + 1 + 1 + 2 + 2 + 3 = 11.

Thus, the average height is 11/7 ≈ 1.571, which is not 1.4285. What did I do wrong here?

1 answer


You already figured it out, but for further reference, here is C# code that calculates the average height using recursion:

using System;

namespace AverageHeight
{
    public class tree
    {
        public tree left;
        public tree right;
    }

    class Program
    {
        // Number of nodes in the tree; starts at 1 to account for the root.
        static int count = 1;

        static void Main(string[] args)
        {
            // The example tree from the question: a root with two children,
            // each of which has two leaf children (7 nodes in total).
            var tree = new tree()
            {
                left = new tree() { left = new tree(), right = new tree() },
                right = new tree() { left = new tree(), right = new tree() }
            };

            // Sum of the node depths (root at depth 0) divided by the node count.
            var avg = (decimal)Height(tree, 0) / count;
            Console.WriteLine(avg); // prints 1.428571... for the example tree
        }

        // Returns the sum of the depths of all nodes in the subtree rooted at t,
        // where l is the depth of t itself. Every child encountered is added to count.
        static int Height(tree t, int l)
        {
            if (t.left != null) count++;
            if (t.right != null) count++;

            return l + (t.left != null ? Height(t.left, l + 1) : 0) +
                       (t.right != null ? Height(t.right, l + 1) : 0);
        }
    }
}
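
For reference, the recursion above sums node depths with the root at level 0: for the example tree that is 0 + 1 + 1 + 2 + 2 + 2 + 2 = 10, count ends at 7, and 10/7 ≈ 1.4286, which matches the expected answer. The 11 in the question comes from summing node heights instead, with each leaf counted as 1. A minimal sketch of that alternative calculation, reusing the same tree class (the HeightSum helper below is only for illustration and is not part of the code above):

// Hypothetical helper: returns the sum of the heights of every node in the
// subtree rooted at t and reports the height of t itself via the out parameter.
// A leaf has height 1; an internal node has height 1 + the height of its taller child.
static int HeightSum(tree t, out int height)
{
    int leftHeight = 0, rightHeight = 0;
    int sum = 0;

    if (t.left != null) sum += HeightSum(t.left, out leftHeight);
    if (t.right != null) sum += HeightSum(t.right, out rightHeight);

    height = 1 + Math.Max(leftHeight, rightHeight);
    return sum + height;
}

For the example tree this returns 1 + 1 + 1 + 1 + 2 + 2 + 3 = 11, so averaging it gives 11/7 ≈ 1.571 rather than 1.4285, which is exactly the discrepancy in the question.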
