Add reconstruct_objs feature #576

Open
wants to merge 4 commits into main

Conversation

waterfall-xi
Contributor

This PR adds a new feature to the iterative process of the optimization algorithm. Users can define a custom function and pass it to the Algorithm through kwargs; run() then calls that function in each iteration and updates the objs, implementing a dynamic objective function across iterations.

For some engineering problems, it can be desirable to change the objective functions dynamically at iteration time, which I believe will be an exciting feature.

blankjul self-assigned this Mar 16, 2024
@blankjul
Collaborator

Can you elaborate a little more on the use case?

Why can someone not use the ask-and-tell interface and set the problem of the algorithm class?
I would also like to call out that this is a little dangerous, because individuals evaluated on the "old" problem will not be re-evaluated.

A new feature is added to the iterative process of the optimization algorithm. Users can define a custom function and pass it to the Algorithm through kwargs; run() calls it in each iteration and updates the objs to implement a dynamic objective function. reconstruct_objs_func() is required to include the argument algo and can use the information in algo to complete the objs reconstruction.
@waterfall-xi
Contributor Author

Thanks for your comment! A new commit, reconstruct_objs according to the algorithm itself, was added. The following is a use case of the committed reconstruct_objs_func feature.

This is a multi-objective dimensionality-reduction scheme implemented with reconstruct_objs_func. reconstruct_objs_func() can be customized by the user and is called in each iteration to update the objs. In this case, reconstruct_objs_func() takes the fixed sub-objective function, a random sub-objective function, the number of samples, the number of random sub-objectives, and the algorithm object as inputs, and reconstructs the new multi-objective function by sampling part of the random objective functions according to information from the algo object.

First, define the base func of reconstruct_objs.

import numpy


def reconstruct_objs_base(fixed_obj, random_obj_fn, n_sample, n_random, algo=None):
    """
        Function to achieve an iteration-based dynamic objective function. One objective is
        always kept fixed, and the remaining n_sample objectives are randomly sampled from
        the random objectives.

    Args:
        fixed_obj: fixed objective function, form: fixed_obj(x)
        random_obj_fn: random objective function, form: random_obj_fn(x, idx); it selects the
            random objective according to idx and returns its output for input x.
        n_sample (int): number of random objectives to sample
        n_random (int): total number of random objectives; n_sample objectives are sampled
            from these n_random objectives
        algo: custom algorithm object; information for objective construction can be obtained from it

    Returns:
        new_objectives (list): list of new objectives

    """
    new_objectives = [fixed_obj]
    if algo is not None:
        # NOTE: Here, the user is free to design how the information in the algo object
        #       determines the sampling probabilities of the random objectives
        p = numpy.array(algo.random_objs_sample_prob)
    else:
        p = numpy.array([1 / n_random] * n_random)

    # NOTE: Here, the user can also use information in the algo object to design a series of
    #       operations that determine the sampling indices directly
    sample_by_algo = True
    if sample_by_algo:
        # determine the sampling indices according to the algo object
        sample_idx = numpy.array([0, 2])  # just an example: the first and third objectives are sampled
    else:
        sample_idx = numpy.random.choice(n_random, n_sample, replace=False, p=p)

    if algo is not None:
        # update some information on algo; the user is free to design this
        algo.a = 1
        algo.b = 2

    for idx in sample_idx:
        def one_random_obj(x, idx=idx):  # the sample index is 'remembered' via the default argument
            return random_obj_fn(x, idx)

        # the sampled random objectives are included in the new objectives
        new_objectives.append(one_random_obj)

    return new_objectives
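To sanity-check the shape of the output, here is a condensed, runnable restatement of the function above with hypothetical toy objectives (the algo-driven branches are dropped, so sampling always uses numpy.random.choice):

```python
import numpy

# Condensed restatement of reconstruct_objs_base with hypothetical toy objectives.
def reconstruct_objs_base(fixed_obj, random_obj_fn, n_sample, n_random):
    new_objectives = [fixed_obj]
    p = numpy.array([1 / n_random] * n_random)  # uniform sampling probabilities
    sample_idx = numpy.random.choice(n_random, n_sample, replace=False, p=p)
    for idx in sample_idx:
        def one_random_obj(x, idx=idx):  # idx frozen via default argument
            return random_obj_fn(x, idx)
        new_objectives.append(one_random_obj)
    return new_objectives

fixed = lambda x: x ** 2                  # hypothetical fixed objective
random_fn = lambda x, idx: (idx + 1) * x  # hypothetical family of random objectives

numpy.random.seed(0)
objs = reconstruct_objs_base(fixed, random_fn, n_sample=2, n_random=4)
print(len(objs))     # 3: the fixed objective plus two sampled ones
print(objs[0](3.0))  # 9.0
```

Each call returns the fixed objective first, followed by n_sample freshly wrapped random objectives.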

Because algorithm.run() calls self.reconstruct_objs_func(algo=self), we additionally define reconstruct_objs with every other argument bound to a local variable as its default argument (unlike true closure variables, default arguments are evaluated once at definition time and are not updated when the local variable changes).

    # def obj_1(x):
    # def obj_2(x):
    # def obj_3(x):
    # def obj_4(x):
    # def obj_5(x):

    def random_obj_fn(x, idx):
        random_obj_l = [obj_2, obj_3, obj_4, obj_5]
        return random_obj_l[idx](x)

    objectives = [obj_1, obj_2, obj_3]  # initial objectives for problem creation

    def reconstruct_objs(fixed_obj=obj_1, random_obj_fn=random_obj_fn, n_sample=2, n_random=4, algo=None):
        return reconstruct_objs_base(fixed_obj=fixed_obj, random_obj_fn=random_obj_fn, 
                                     n_sample=n_sample, n_random=n_random, algo=algo)
    
    kws = {  # to algorithm object
        'reconstruct_objs_func': reconstruct_objs,
    }

This case shows how to design a reconstruct_objs_func function to achieve the iteration-based dynamic objective functions the user wants.
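The definition-time binding of default arguments mentioned above can be illustrated with a small, self-contained example (toy functions, not part of the PR):

```python
# Closures see the *variable*; default arguments freeze the *value* at definition time.
closures = []
defaults = []
for i in range(3):
    closures.append(lambda x: x + i)       # late binding: all share the final i
    defaults.append(lambda x, i=i: x + i)  # early binding: each keeps its own i

print([f(0) for f in closures])  # [2, 2, 2]
print([f(0) for f in defaults])  # [0, 1, 2]
```

This is why one_random_obj takes idx=idx and why reconstruct_objs pins its inputs as defaults rather than relying on closures.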

I prefer assigning the new objectives to the objs member variable over setting the problem of the algorithm class, because only objs needs to change here. The solution in this commit may not be in pymoo's style, and you are free to change it to fit the whole project.

I understand that changing the objective function during optimization may sound unusual. Still, in real engineering problems it may be difficult to determine the appropriate objective function up front, and continually adjusting it during the optimization process can be a solution. Having each individual evaluated on a new objective function (problem) in an iteration is the way to facilitate this approach. In the case I gave, the fixed objective function represents the primary objective and must occupy a position in every iteration, while the random objective functions represent minor, redundant, or auxiliary objectives that can optionally be added to reduce the pressure on the objective dimension. With the reconstruct_objs_func feature, users have the freedom to design customized dynamic objective schemes to solve their problems.

By the way, it's unfortunate that the commit doesn't pass the Testing/testing checks (tests/algorithms/test_no_modfication.py::test_no_mod[zdt1-entry0] fails), and I'm a newbie at this. In my view, the commit is OK and should pass the test; I'm now stuck on how to debug it.

@blankjul
Collaborator

I would like to call out that changing the objective function during the optimization is part of dynamic optimization. Most algorithms are proposed for the static use case. I agree that dynamic optimization also has quite a few use cases though! There is also an algorithm in pymoo (https://pymoo.org/algorithms/moo/dnsga2.html) which then re-evaluates the objective function to adapt the current population to the change in the objective.

I will keep this issue open for now and want to see if other users also see the need for having this as a standard method in pymoo. Right now my feeling is that this is very special behavior for actual dynamic optimization problems.
