Converting a decision table to a decision tree

How can I convert (or visualize) a decision table into a decision tree graph? Is there an algorithm for this, or software that can visualize it?

For example, I want to render my decision table below: http://i.stack.imgur.com/Qe2Pw.jpg

+3




3 answers


I must say that this is an interesting question.

I don't know of a definitive answer, but I would suggest a method like this:

  • Use a Karnaugh map to turn your decision table into a minimized boolean function.
  • Turn that function into a tree.


Let's simplify the example and suppose the Karnaugh map gives you the function (a and b) or c or d. You can turn this into a tree like the following:

(image: tree diagram for the function; source: the answer author's own drawing)
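
As an illustration (my own addition, not part of the original answer), the same tree can be written down and evaluated in a few lines of Python; the Node class, the field names, and the shared c/d subtrees below are all my own choices:

```python
# A minimal sketch: the function (a and b) or c or d expressed as a
# decision tree and evaluated. All names here are my own invention.
class Node:
    def __init__(self, test=None, yes=None, no=None, leaf=None):
        self.test = test  # attribute to test; None for a leaf
        self.yes = yes    # subtree taken when the attribute is True
        self.no = no      # subtree taken when the attribute is False
        self.leaf = leaf  # decision stored at a leaf

YES, NO = Node(leaf="yes"), Node(leaf="no")
# Fallback chain: if (a and b) fails, try c, then d.
# The c/d subtrees are shared between branches for brevity.
D = Node(test="d", yes=YES, no=NO)
C = Node(test="c", yes=YES, no=D)
TREE = Node(test="a",
            yes=Node(test="b", yes=YES, no=C),
            no=C)

def decide(node, facts):
    """Walk from the root using a dict of boolean attribute values."""
    while node.leaf is None:
        node = node.yes if facts[node.test] else node.no
    return node.leaf

print(decide(TREE, {"a": True,  "b": True,  "c": False, "d": False}))  # yes
print(decide(TREE, {"a": True,  "b": False, "c": False, "d": True}))   # yes
print(decide(TREE, {"a": False, "b": False, "c": False, "d": False}))  # no
```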

+3




It is of course easier to create a decision table from a decision tree than the other way around.

But as I see it, you can convert the decision table into a dataset. Let "Disease" be the class attribute and treat the pieces of evidence as plain binary attributes. From this you can easily generate a decision tree using one of the available decision tree induction algorithms, such as C4.5. Just remember to turn off pruning and lower the minimum-number-of-objects parameter (a sketch follows below).



During this process you will lose some information, but the accuracy will remain the same. Take a look at the two rows describing disease D04: the second row is actually more general than the first. A decision tree generated from this data would recognize that disease from attributes E11, E12 and E13 alone, since they are sufficient to label the instance correctly.
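
For illustration (my own addition), here is a hedged sketch of that conversion in Python. Note that scikit-learn's DecisionTreeClassifier implements CART rather than the C4.5 the answer names (Weka's J48 would be closer), and the rows below are invented stand-ins, since the question's actual table image is not reproduced here:

```python
# Sketch only: induce a tree from a (made-up) decision table.
# scikit-learn's DecisionTreeClassifier is CART, not C4.5; the
# E11/E12/E13 rows are invented stand-ins for the question's table.
from sklearn.tree import DecisionTreeClassifier, export_text

# Each row holds binary evidence attributes; the class is the disease.
X = [
    [1, 0, 0],  # only E11 present  -> D01
    [0, 1, 0],  # only E12 present  -> D02
    [0, 0, 1],  # only E13 present  -> D03
    [1, 1, 1],  # all three present -> D04
]
y = ["D01", "D02", "D03", "D04"]

# ccp_alpha=0.0 keeps pruning off, and min_samples_leaf=1 mirrors
# "lower the minimum-number-of-objects parameter".
clf = DecisionTreeClassifier(criterion="entropy",
                             min_samples_leaf=1, ccp_alpha=0.0)
clf.fit(X, y)
print(export_text(clf, feature_names=["E11", "E12", "E13"]))
```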

+1




I spent several hours looking for a good algorithm, but I'm happy with my results. My code is too messy to paste here (I can share it privately on request), but the general idea is as follows; a runnable sketch appears after the steps.

Suppose you have a dataset with some decision criteria and results.

  1. Define a tree structure (e.g. data.tree in R) and create a root "Start" node.
  2. Calculate the entropy of your dataset. If the entropy is zero, you are done.
  3. For each remaining criterion, compute the entropy of the branches that splitting on it would create.
  4. The criterion whose branches have the lowest (minimum) entropy is your next tree node; add its branches as child nodes.
  5. Split your data according to the decision point / tree node found in step 4 and remove the used criterion.
  6. Repeat steps 2-5 for each branch until all branches have entropy 0.
  7. Enjoy your perfect decision tree :)
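
Since the original code was not posted, here is a minimal sketch of the steps above (essentially ID3-style greedy induction), written in Python rather than the R/data.tree setup the answer mentions; the table and all identifiers are my own invented stand-ins:

```python
# Minimal sketch of the steps above; the toy table is invented.
from collections import Counter
from math import log2

def entropy(rows, target):
    """Shannon entropy of the target column over the given rows."""
    counts = Counter(row[target] for row in rows)
    return -sum((n / len(rows)) * log2(n / len(rows))
                for n in counts.values())

def build_tree(rows, criteria, target):
    # Step 2: entropy zero means the branch is pure; emit a leaf.
    if entropy(rows, target) == 0.0:
        return rows[0][target]
    if not criteria:  # no criteria left; fall back to the majority label
        return Counter(r[target] for r in rows).most_common(1)[0][0]

    # Step 3: weighted entropy of the branches each criterion creates.
    def split_entropy(crit):
        parts = Counter(row[crit] for row in rows)
        return sum((n / len(rows)) *
                   entropy([r for r in rows if r[crit] == v], target)
                   for v, n in parts.items())

    # Step 4: the minimum-entropy criterion becomes the next node.
    best = min(criteria, key=split_entropy)
    remaining = [c for c in criteria if c != best]
    # Steps 5-6: split on it, drop it, and recurse into each branch.
    return {(best, v): build_tree([r for r in rows if r[best] == v],
                                  remaining, target)
            for v in {row[best] for row in rows}}

table = [
    {"E1": 1, "E2": 0, "disease": "D01"},
    {"E1": 0, "E2": 1, "disease": "D02"},
    {"E1": 1, "E2": 1, "disease": "D04"},
    {"E1": 0, "E2": 0, "disease": "D03"},
]
print(build_tree(table, ["E1", "E2"], "disease"))
```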
+1








