I have to write a function, `reduce`, that operates on trees and returns trees. For example:
```
reduce (B (P (F (P (B A) A)) (O (B (B A))))) = (B (P (B A) (B A)))
reduce (F (P (O (B (B A))) (B A)))           = (B A)
reduce (P A (O (B (B (D A)))))               = (P A A)
reduce (App (Lam (B (Var 0))) (B (B A)))     = (B (B (B A)))
```
And so on. To be specific, `reduce` is supposed to normalise a program in a language similar to the Simply Typed Lambda Calculus. I have part of its definition written, but it is still incomplete and gets wrong results in some cases.
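To keep the post self-contained, here is a simplified sketch of the shape it takes (not my real code: the `Term` type is inferred from the examples above; only the beta case for `App`/`Lam` is pinned down by them, and the `F`/`O` cases, which the examples show must do real work, are left as placeholder recursions):

```haskell
-- Term type inferred from the examples; names and shapes are guesses.
data Term
  = A
  | B Term | D Term | F Term | O Term
  | P Term Term
  | Var Int          -- de Bruijn index
  | Lam Term
  | App Term Term
  deriving (Eq, Show)

-- shift d c t: add d to every variable with index >= cutoff c.
shift :: Int -> Int -> Term -> Term
shift d c t = case t of
  Var k | k >= c    -> Var (k + d)
        | otherwise -> Var k
  Lam b   -> Lam (shift d (c + 1) b)
  App f x -> App (shift d c f) (shift d c x)
  B x     -> B (shift d c x)
  D x     -> D (shift d c x)
  F x     -> F (shift d c x)
  O x     -> O (shift d c x)
  P x y   -> P (shift d c x) (shift d c y)
  A       -> A

-- subst j s t: replace Var j by s in t, adjusting indices under binders.
subst :: Int -> Term -> Term -> Term
subst j s t = case t of
  Var k | k == j    -> s
        | otherwise -> Var k
  Lam b   -> Lam (subst (j + 1) (shift 1 0 s) b)
  App f x -> App (subst j s f) (subst j s x)
  B x     -> B (subst j s x)
  D x     -> D (subst j s x)
  F x     -> F (subst j s x)
  O x     -> O (subst j s x)
  P x y   -> P (subst j s x) (subst j s y)
  A       -> A

-- Only the beta rule (App/Lam) is fixed by the examples above. The examples
-- show F and O must have real rules too (e.g. O (B (B A)) = B A), but here
-- they just recurse; that is exactly the kind of gap I am struggling with.
reduce :: Term -> Term
reduce t = case t of
  App f x -> case reduce f of
    Lam b -> reduce (shift (-1) 0 (subst 0 (shift 1 0 x) b))
    f'    -> App f' (reduce x)
  Lam b   -> Lam (reduce b)
  B x     -> B (reduce x)
  D x     -> D (reduce x)
  F x     -> F (reduce x)             -- placeholder, known wrong
  O x     -> O (reduce x)             -- placeholder, known wrong
  P x y   -> P (reduce x) (reduce y)
  _       -> t
```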
My problem is that getting the whole definition of `reduce` right is really tricky. There are so many variables, recursion points and things to track that it is very hard for a human to get right. But I can easily generate an unlimited number of correct input/output pairs: all I have to do is write a few test programs and the trees they should output!
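Generating inputs is cheap too. Here is a sketch, not my actual harness: `genTerm` and `prop_idempotent` are made-up names, the generator can emit open terms (free `Var`s) that may need filtering, and idempotence is just one cheap screening property before a candidate is checked against the hand-written pairs:

```haskell
import Test.QuickCheck

-- Hypothetical generator: random Terms up to a rough size bound.
genTerm :: Int -> Gen Term
genTerm 0 = elements [A, Var 0]
genTerm n = oneof
  [ pure A
  , B <$> sub, D <$> sub, F <$> sub, O <$> sub
  , P <$> sub <*> sub
  , Lam <$> sub
  , App <$> sub <*> sub
  , Var <$> choose (0, 2)
  ]
  where sub = genTerm (n `div` 2)

-- A normaliser should be idempotent: reducing a normal form changes nothing.
prop_idempotent :: Property
prop_idempotent = forAll (genTerm 16) $ \t -> reduce (reduce t) == reduce t
-- run with: quickCheck prop_idempotent
```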
So, the question: is there any way machine learning can help me with this problem? Are there ways to throw a lot of computing power at finding the correct implementation of `reduce` from those input/output pairs?