Continued fraction in MeTTa code for OpenCog Hyperon

This notebook is an example of searching for an arithmetic expression that evaluates to a given number. Here the expression takes the form of a continued fraction such as 1 / (4 + 1 / (4 + ...)), and the target is Euler's number e. The code evolves a simple MeTTa program that can be evaluated with the current experimental version of OpenCog Hyperon.
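
For intuition, e has the well-known simple continued fraction [2; 1, 2, 1, 1, 4, 1, 1, 6, 1, 1, 8, ...], which can be truncated and evaluated from the innermost term outward. The following standalone Python sketch (independent of the notebook's code) illustrates this:

from math import e

# Simple continued fraction of e, truncated after twelve terms
terms = [2, 1, 2, 1, 1, 4, 1, 1, 6, 1, 1, 8]
value = terms[-1]
for t in reversed(terms[:-1]):
    value = t + 1 / value
print(value, abs(value - e))  # close to e, error on the order of 1e-8

Note that the grammar defined below describes generalized continued fractions, whose numerators are not restricted to 1, so the evolved expression need not match this classical expansion.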

In [1]:
from math import e

import alogos as al
import unified_map as um
from hyperon.common import MeTTa

metta = MeTTa()

Definition of search space and goal

1) Grammar

This grammar defines the search space: a MeTTa program that represents an arithmetic expression having the form of a continued fraction.

In [2]:
ebnf_text = """
FRAC = "(+ " CONST " (/ " CONST " " FRAC "))" | CONST
CONST = "1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9"
"""

grammar = al.Grammar(ebnf_text=ebnf_text)
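
The rule FRAC is recursive: it either terminates in a single constant or nests another fraction one level deeper. The following sketch mimics such a random derivation; random_fraction is a hypothetical helper for illustration only and is not how alogos derives strings internally:

import random

def random_fraction(depth=0, max_depth=5):
    # FRAC -> CONST, or FRAC -> "(+ " CONST " (/ " CONST " " FRAC "))"
    const = random.choice("123456789")
    if depth >= max_depth or random.random() < 0.3:
        return const
    return f"(+ {const} (/ {random.choice('123456789')} {random_fraction(depth + 1, max_depth)}))"

print(random_fraction())  # e.g. (+ 3 (/ 7 (+ 2 (/ 5 9))))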

2) Objective function

The objective function receives a candidate solution (a string of the grammar's language) and returns a fitness value for it. This is done by 1) executing the string as a MeTTa program with OpenCog Hyperon, which evaluates the arithmetic expression and returns the number it produces, and 2) determining how much this number differs from the target number e.

In [3]:
def objective_function(string):
    if len(string) > 200:
        return None  # discard overly long expressions
    result = metta.interpret(string)  # evaluate the expression with Hyperon
    number = result[0].get_object().value  # extract the numerical result
    return abs(number - e)  # distance to the target number
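
Since the generated strings are ordinary prefix arithmetic, a result can be cross-checked without Hyperon. The small evaluator below is a hypothetical helper for verification only; the notebook itself relies on metta.interpret:

def eval_prefix(expr):
    # Evaluate a prefix expression such as "(+ 2 (/ 3 4))" without MeTTa
    tokens = expr.replace("(", " ( ").replace(")", " ) ").split()

    def parse(pos):
        if tokens[pos] != "(":  # bare constant
            return float(tokens[pos]), pos + 1
        op = tokens[pos + 1]  # "+" or "/"
        left, pos = parse(pos + 2)
        right, pos = parse(pos)
        assert tokens[pos] == ")"
        return (left + right) if op == "+" else (left / right), pos + 1

    return parse(0)[0]

print(eval_prefix("(+ 2 (/ 3 4))"))  # 2.75, matching the MeTTa result below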

Generation of a random solution

Check whether the grammar and the objective function work as intended.

In [4]:
random_string = grammar.generate_string()
print(random_string)
(+ 2 (/ 3 4))
In [5]:
result = metta.interpret(random_string)
print(result)
[2.75]
In [6]:
objective_function(random_string)
Out[6]:
0.03171817154095491

Search for an optimal solution

Evolutionary optimization with random variation and non-random selection is used to find increasingly better candidate solutions.
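
To make the idea concrete, the following simplified sketch performs plain hill climbing on strings: the hypothetical helper mutate swaps one digit for another, which always yields another string of the grammar's language, and the better of parent and child is kept. This is a minimal stand-in for illustration, not alogos's actual algorithm, which applies grammar-aware variation operators to genotypes:

import random

def mutate(string):
    # Replace one random digit with another digit (stays in the language)
    positions = [i for i, ch in enumerate(string) if ch.isdigit()]
    i = random.choice(positions)
    return string[:i] + random.choice("123456789") + string[i + 1:]

best = grammar.generate_string()
best_fit = objective_function(best)
while best_fit is None:  # skip overly long initial strings
    best = grammar.generate_string()
    best_fit = objective_function(best)
for _ in range(1000):
    child = mutate(best)
    child_fit = objective_function(child)
    if child_fit < best_fit:  # non-random selection: keep the better one
        best, best_fit = child, child_fit
print(best, best_fit)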

1) Parameterization

In [7]:
num_gen = 100
num_ind = 300

ea = al.EvolutionaryAlgorithm(
    grammar, objective_function, 'min',
    verbose=True,
    max_or_min_fitness=0.0,
    max_generations=num_gen,
    population_size=num_ind,
    offspring_size=num_ind,
    evaluator=um.univariate.parallel.futures,
)
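
These settings cap the search at 100 generations with 300 individuals each, stop early if a candidate reaches a fitness of 0.0, and evaluate candidates in parallel with unified_map's futures-based evaluator.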

2) Run

The search proceeds one generation at a time, and intermediate results are reported to show how the solutions gradually improve.

In [8]:
best_ind = ea.run()
Progress         Generations      Evaluations      Runtime (sec)    Best fitness    
..... .....      10               2934             6.1              0.0003331105103270282
..... .....      20               5804             9.7              1.7536305070287028e-06
..... .....      30               8688             13.5             1.9152178953874e-08
..... .....      40               11598            17.7             3.915531454623533e-09
..... .....      50               14481            22.7             3.8033132199188913e-10
..... .....      60               17398            29.7             1.4483969579259792e-11
..... .....      70               20280            36.4             1.4483969579259792e-11
..... .....      80               23171            43.6             1.0049738818906917e-12
..... .....      90               26093            52.8             1.0049738818906917e-12
..... .....      100              28999            61.6             1.0049738818906917e-12


Finished         100              28999            62.5             1.0049738818906917e-12

3) Result

Show the phenotype of the best individual found so far. With more computing time, even better solutions might be discovered.

In [9]:
string = best_ind.phenotype
print(string)
(+ 2 (/ 3 (+ 4 (/ 1 (+ 5 (/ 7 (+ 9 (/ 9 (+ 5 (/ 7 (+ 9 (/ 5 (+ 3 (/ 5 (+ 4 (/ 2 (+ 3 (/ 8 (+ 5 (/ 7 (+ 4 (/ 9 (+ 1 (/ 8 (+ 5 (/ 7 (+ 4 (/ 3 5))))))))))))))))))))))))))))
In [10]:
print('Target:', e)
print('Found: ', metta.interpret(string)[0])
Target: 2.718281828459045
Found:  2.71828182846005