🐦 Opytimizer is a Python library consisting of meta-heuristic optimization techniques.

Overview

Opytimizer: A Nature-Inspired Python Optimizer


Welcome to Opytimizer.

Did you ever reach a bottleneck in your computational experiments? Are you tired of selecting suitable parameters for a chosen technique? If yes, Opytimizer is the real deal! This package provides an easy-to-use implementation of meta-heuristic optimization techniques. From agents to search spaces, from internal functions to external communication, we aim to foster all research related to optimization.

Use Opytimizer if you need a library or wish to:

  • Create your optimization algorithm;
  • Design or use pre-loaded optimization tasks;
  • Mix-and-match different strategies to solve your problem;
  • Have fun optimizing things.

Read the docs at opytimizer.readthedocs.io.

Opytimizer is compatible with: Python 3.6+.


Package guidelines

  1. The very first information you need is in the very next section.
  2. Installation is also easy: if you wish to read the code and dive into it yourself, follow along.
  3. Note that there might be some additional steps in order to use our solutions.
  4. If there is a problem, please do not hesitate to contact us.
  5. Finally, we focus on minimization. Keep that in mind when designing your problem.

Citation

If you use Opytimizer to fulfill any of your needs, please cite us:

@misc{rosa2019opytimizer,
    title={Opytimizer: A Nature-Inspired Python Optimizer},
    author={Gustavo H. de Rosa and Douglas Rodrigues and João P. Papa},
    year={2019},
    eprint={1912.13002},
    archivePrefix={arXiv},
    primaryClass={cs.NE}
}

Getting started: 60 seconds with Opytimizer

First of all, we have examples. Yes, they are commented. Just browse to examples/, choose your subpackage, and follow the example. We have high-level examples for most tasks we could think of and amazing integrations (Learnergy, NALP, OPFython, PyTorch, Scikit-Learn, TensorFlow).
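Before diving into the package tree, the workflow the library automates can be sketched in a library-independent way. Below is a minimal, hedged illustration of the "agents search a bounded space to minimize a function" loop, written as plain-Python random search; the real API uses its own classes (spaces, optimizers, functions) rather than these hypothetical helper names:

```python
import random

def sphere(x):
    # Simple benchmark: f(x) = sum of squares, with its minimum at the origin.
    return sum(v * v for v in x)

def random_search(function, lower, upper, n_agents=25, n_iterations=100, seed=0):
    # Mimic the agents-in-a-search-space idea: sample candidate positions
    # inside the bounds and keep the best (lowest) fitness found.
    rng = random.Random(seed)
    best_pos, best_fit = None, float("inf")
    for _ in range(n_iterations):
        for _ in range(n_agents):
            pos = [rng.uniform(lo, hi) for lo, hi in zip(lower, upper)]
            fit = function(pos)
            if fit < best_fit:
                best_pos, best_fit = pos, fit
    return best_pos, best_fit

pos, fit = random_search(sphere, lower=[-10, -10], upper=[10, 10])
```

Any of the library's meta-heuristics plays the role of `random_search` here, only with a smarter position-update rule.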

Alternatively, if you wish to learn even more, please take a minute:

Opytimizer is based on the following structure, and you should pay attention to its tree:

- opytimizer
    - core
        - agent
        - function
        - node
        - optimizer
        - space
    - functions
        - weighted
    - math
        - distribution
        - general
        - hyper
        - random
    - optimizers
        - boolean
        - evolutionary
        - misc
        - population
        - science
        - social
        - swarm
    - spaces
        - boolean
        - grid
        - hyper_complex
        - search
        - tree
    - utils
        - constants
        - decorator
        - exception
        - history
        - logging
    - visualization
        - convergence
        - surface

Core

Core is the parent of everything. Here you should find the parent classes defining the basis of our structure, providing the variables and methods that help construct the other modules.

Functions

Instead of using raw and straightforward functions, why not try this module? Compose high-level abstract functions or even new function-based ideas to solve your problems. Note that, for now, we only support multi-objective function strategies.
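A hedged, library-independent sketch of the weighted strategy this module implements: several objectives are combined into one scalar objective through user-chosen weights (the actual WeightedFunction API may differ in names and signature):

```python
def weighted(functions, weights):
    # Combine several objective functions into a single scalar objective
    # by summing each fitness scaled by its weight.
    def combined(x):
        return sum(w * f(x) for f, w in zip(functions, weights))
    return combined

f1 = lambda x: sum(v * v for v in x)     # sphere
f2 = lambda x: sum(abs(v) for v in x)    # absolute-value sum
g = weighted([f1, f2], [0.7, 0.3])
# g([1, 2]) == 0.7 * 5 + 0.3 * 3 == 4.4
```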

Math

Just because we are computing stuff does not mean we do not need math. Math is the mathematical package, containing low-level math implementations. From random numbers to distribution generation, you can find what you need in this module.
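As an illustration of the kind of low-level primitive such a package typically provides, here is a sketch using only Python's standard library (the function name and signature are assumptions, not the module's actual API):

```python
import random

def uniform(lower, upper, size, seed=None):
    # Draw `size` samples uniformly from [lower, upper): the kind of
    # primitive used to initialize agent positions inside a search space.
    rng = random.Random(seed)
    return [rng.uniform(lower, upper) for _ in range(size)]

samples = uniform(-5.0, 5.0, size=4, seed=42)
```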

Optimizers

This is why we are called Opytimizer. This is the heart of the heuristics, where you can find a large number of meta-heuristics and optimization techniques: anything that can be called an optimizer. Please take a look at the available optimizers.

Spaces

One can see a space as the place where agents update their positions and evaluate a fitness function. However, the newest approaches may consider a different type of space. With that in mind, we are glad to support diverse space implementations.

Utils

This is a utility package. Common things shared across the application should be implemented here. It is better to implement something once and reuse it than to re-implement the same thing over and over again.

Visualization

Everyone needs images and plots to help visualize what is happening, correct? This package will provide every visual-related method for you. Check a specific variable convergence, your fitness function convergence, plot benchmark function surfaces, and much more!


Installation

We believe that everything has to be easy. Not tricky or daunting, Opytimizer will be the one-to-go package that you will need, from the very first installation to daily-task implementations. Just run the following under your preferred Python environment (raw, conda, virtualenv, whatever):

pip install opytimizer

Alternatively, if you prefer to install the bleeding-edge version, please clone this repository and use:

pip install -e .

Environment configuration

Note that sometimes additional setup is needed. If so, this is the section where you will find all the details.

Ubuntu

No specific additional commands needed.

Windows

No specific additional commands needed.

MacOS

No specific additional commands needed.


Support

We know that we do our best, but it is inevitable to acknowledge that we make mistakes. If you ever need to report a bug or a problem, or just want to talk to us, please do so! We will do our best to help at this repository or at [email protected].


Comments
  • [BUG] AttributeError: 'History' object has no attribute 'show'


    Describe the bug

    It looks like there is no show() method for the returned opytimizer history.

    To Reproduce Steps to reproduce the behavior:

    1. Follow the steps from the wiki Tutorial: Your first optimization
      1. Run the optimizer with o.start():
         o = Opytimizer(space=s, optimizer=p, function=f)
         history = o.start()
      2. Show the history: history.show()

    Expected behavior Not sure what I expected, just curious :)

    Screenshots

    2020-01-02 13:26:28,270 - opytimizer.optimizers.fa — INFO — Iteration 1000/1000
    2020-01-02 13:26:28,278 - opytimizer.optimizers.fa — INFO — Fitness: 4.077875713322641e-14
    2020-01-02 13:26:28,279 - opytimizer.optimizers.fa — INFO — Position: [[-1.42791381e-07]
     [-1.42791381e-07]]
    2020-01-02 13:26:28,279 - opytimizer.opytimizer — INFO — Optimization task ended.
    2020-01-02 13:26:28,279 - opytimizer.opytimizer — INFO — It took 7.672951936721802 seconds.
    >>> history.show()
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    AttributeError: 'History' object has no attribute 'show'
    >>> history
    <opytimizer.utils.history.History object at 0x113b75be0>
    >>> history.show()
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    AttributeError: 'History' object has no attribute 'show'
    

    Desktop (please complete the following information):

    • OS: macOS Mojave 10.14.6
    • Virtual Environment: conda base
    • Python Version:
    (base) justinmai$ python3
    Python 3.7.3 (default, Mar 27 2019, 16:54:48) 
    [Clang 4.0.1 (tags/RELEASE_401/final)] :: Anaconda, Inc. on darwin
    


    bug 
    opened by justinTM 5
  • [REG] How to suppress DEBUG log messages in opytimizer.core.space


    Hello, after initializing SearchSpace there is a debug message printed to stdout. How can we turn it off/on? Following is the message: opytimizer.core.space — DEBUG — Agents: 25 |....

    I believe it's printed because of line #223 in the file opytimizer/core/space.py.

    For large dimensions it prints all the lower and upper bounds which we may not always require.

    Thanks.
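Since the message format above is that of Python's standard logging module, a generic workaround (assuming the logger names match the prefixes shown in the message) is to raise the level of the offending logger before building the space:

```python
import logging

# Silence DEBUG records from the space module; INFO and above still pass.
logging.getLogger("opytimizer.core.space").setLevel(logging.INFO)

# Or quiet the whole library at once:
logging.getLogger("opytimizer").setLevel(logging.WARNING)
```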

    general 
    opened by amir1m 4
  • [NEW] Dump optimization progress


    Is your feature request related to a problem? Please describe. If the server running the optimization fails, we lose the time already spent running.

    Describe the solution you'd like Dump the optimization object from time to time.

    Describe alternatives you've considered Maybe dump the agents?
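Until such a feature lands, a generic workaround is periodic checkpointing with the standard pickle module. A sketch, where `state` stands in for whatever you want to persist (e.g. the agents or the history object):

```python
import pickle

def save_checkpoint(state, path):
    # Serialize the current optimization state so a crashed run can resume.
    with open(path, "wb") as f:
        pickle.dump(state, f)

def load_checkpoint(path):
    # Restore a previously dumped state.
    with open(path, "rb") as f:
        return pickle.load(f)

save_checkpoint({"iteration": 10, "best_fitness": 0.5}, "run.pkl")
resumed = load_checkpoint("run.pkl")
```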

    enhancement 
    opened by gcoimbra 4
  • [NEW] Constrained optimization


    Hi, thanks for the work.

    Any plans to add functionality to define constraints for the optimization? For instance, inequalities or any arbitrary non-linear constraints on the inputs and/or on the outputs?

    enhancement 
    opened by ogencoglu 4
  • [NEW] Using population data for population-based algorithms?


    Hello, first of all, thank you for sharing such a fantastic repo that can be used as an off-the-shelf collection of meta-heuristic optimization algorithms!

    I have a question regarding how to use my own population data for optimizers in Opytimizer. Rather than using a SearchSpace with predetermined upper/lower bounds, is there any way I can use my own population samples to start the optimization from?

    Thank you, hope you have a wonderful day!

    enhancement 
    opened by jimmykimmy68 3
  • [REG] What is the difference between grid space, and discrete space example?


    Greetings,

    I have a mixed search space problem of five dimensions, 4 discrete and one continuous. How do I implement a discrete search space with these dimensions, where the increment won't be the same? I found that grid space offers me the flexibility I need, yet I noticed you used a different implementation in the discrete space example, so which one should I use?

    My search space:

    step = [1, 1, 1, 1, 0.1]
    lower_bound = [16, 3, 2, 0, 0]
    upper_bound = [400, 20, 20, 1, 0.33]
    

    Thanks in advance.
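A library-independent sketch of the mixed grid described above: each variable gets its own step, the candidate values per dimension are enumerated, and the full grid is the Cartesian product of the axes (pure Python; the actual GridSpace semantics may differ):

```python
from itertools import product

def axis(lower, upper, step):
    # Enumerate values from lower to upper (inclusive) with the given step.
    values, v = [], lower
    while v <= upper + 1e-9:          # small epsilon to survive float steps
        values.append(round(v, 10))
        v += step
    return values

step = [1, 1, 1, 1, 0.1]
lower = [16, 3, 2, 0, 0]
upper = [400, 20, 20, 1, 0.33]

axes = [axis(lo, hi, st) for lo, hi, st in zip(lower, upper, step)]
# product(*axes) lazily yields every 5-dimensional candidate point.
n_points = 1
for a in axes:
    n_points *= len(a)
```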

    general 
    opened by TamerAbdelmigid 3
  • [NEW] Different number of step for each variable


    Is your feature request related to a problem? Please describe. It's not

    Describe the solution you'd like I'd like a different step size for each

    Additional context Sometimes variables are less sensitive than others. Some are integers. Is there any way to use a different step for each one?

    enhancement 
    opened by gcoimbra 3
  • [REG] How to get a detailed print out during optimization?


    Greetings,

    My function takes time to evaluate, and I want to closely monitor what happens during the optimization process. When using the grid search space and printing the fitness from my function, I could see the movement along the grid. But when using the normal search space, my processor utilization is 100% and it takes hours with no printout. So, how do I get more details? Is there something like a degree of verbosity?

    Thanks in advance.

    general 
    opened by TamerAbdelmigid 2
  • [NEW] Define objective function for regression problem


    Hi there,

    I attempted to define an objective function (using a CatBoost model for the data) to solve a minimization problem in a regression task, however I failed to create the new objective function. So, does your package offer a solution for regression, and can we define such an objective function in this case?

    My desired objective function is something like this, to minimize the MSE:

    from catboost import CatBoostRegressor as cbr
    from sklearn.metrics import mean_squared_error

    cbr_model = cbr()

    def objective_function(cbr_model, X_train3, y_train3, X_test3, y_test3):
        cbr_model.fit(X_train3, y_train3)
        mse = mean_squared_error(y_test3, cbr_model.predict(X_test3))
        return mse
    

    Many thanks, Thang
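A minimization library usually expects the objective to take only the candidate position; the extra arguments (model, data) can be bound in a closure. A hedged, self-contained sketch of that pattern, with a toy linear predictor standing in for the CatBoost model:

```python
def make_objective(train_x, train_y):
    # Bind the data inside a closure so the optimizer only sees f(position).
    def objective(position):
        # `position` holds the candidate parameters; here a single slope `w`
        # for a toy model y = w * x, scored by mean squared error.
        w = position[0]
        errors = [(w * x - y) ** 2 for x, y in zip(train_x, train_y)]
        return sum(errors) / len(errors)
    return objective

f = make_objective(train_x=[1, 2, 3], train_y=[2, 4, 6])
# f([2.0]) == 0.0 because y = 2 * x fits this data exactly.
```

In the CatBoost case, `objective` would decode hyperparameters from `position`, fit the model on the bound training data, and return the test MSE.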

    enhancement 
    opened by hanamthang 2
  • [REG] How to plot convergence diagram?


    Hello, I was looking for a convergence diagram. I found an example of using the convergence function in opytimizer/examples/visualization/convergence_plotting.py. However, it uses a few constant values of agent positions. Is there a convergence example that shows how to use this function with an actual optimization problem, such as after carrying out PSO?

    Thanks,

    general 
    opened by amir1m 2
  • [REG] Is there a way to provide initial values before starting optimization?


    Hello, Hope you keeping well.

    I am looking to provide an initial value to the optimization loop. Is there a way, through SearchSpace or otherwise, to provide an initial/starting solution?

    Thanks.

    general 
    opened by amir1m 2
Releases(v3.1.2)
  • v3.1.2(Sep 9, 2022)

    Changelog

    Description

    Welcome to v3.1.2 release.

    In this release, we added variable name mapping to search spaces.

    Includes (or changes)

    • core.search_space
    Source code(tar.gz)
    Source code(zip)
  • v3.1.1(May 4, 2022)

    Changelog

    Description

    Welcome to v3.1.1 release.

    In this release, we added pre-commit hooks and annotated typing.

    Includes (or changes)

    • opytimizer
    Source code(tar.gz)
    Source code(zip)
  • v3.1.0(Jan 7, 2022)

    Changelog

    Description

    Welcome to v3.1.0 release.

    In this release, we implemented the initial parts for holding graph-related searches, which will support Neural Architecture Search (NAS) in the future.

    Additionally, we have added the first algorithm for calculating the Pareto frontier of pre-defined points (Non-Dominated Sorting).

    Includes (or changes)

    • core
    • optimizers
    • spaces
    Source code(tar.gz)
    Source code(zip)
  • v3.0.2(Jun 28, 2021)

    Changelog

    Description

    Welcome to v3.0.2 release.

    In this release, we implemented the remaining meta-heuristics that were on hold. Please note that they are supposed to be 100% working, yet we still need to experimentally evaluate their performance.

    Additionally, an important hot-fix regarding the calculation of Euclidean Distance has been corrected.

    Includes (or changes)

    • optimizers
    • math.general
    Source code(tar.gz)
    Source code(zip)
  • v3.0.1(May 31, 2021)

    Changelog

    Description

    Welcome to v3.0.1 release.

    In this release, we have added a bunch of meta-heuristics that were supposed to be implemented earlier.

    Includes (or changes)

    • optimizers
    Source code(tar.gz)
    Source code(zip)
  • v3.0.0(May 12, 2021)

    Changelog

    Description

    Welcome to v3.0.0 release.

    In this release, we have revamped the whole library, rewriting base packages, such as agent, function, node and space, as well as implementing new features, such as callbacks and a better Opytimizer bundler.

    Additionally, we have rewritten every optimizer and their tests, removing more than 2.5k lines that were tagged as "repeatable". Furthermore, we have removed excessive commentaries to provide a cleaner reading and have rewritten every example to include the newest features.

    Please take a while to check our most important advancements and read the docs at: opytimizer.readthedocs.io

    Note that this is a major release and we expect everyone to update their corresponding packages, as this update will not work with v2.x.x versions.

    Includes (or changes)

    • opytimizer
    Source code(tar.gz)
    Source code(zip)
  • v2.1.4(Apr 28, 2021)

    Changelog

    Description

    Welcome to v2.1.4 release.

    In this release, we have added the following optimizers: AOA and AO. Additionally, we have implemented a step size for each variable in the grid, as pointed out by @gcoimbra.

    Please read the docs at: opytimizer.readthedocs.io

    Note that this is the latest update of the v2 branch. The following update will feature a major rework of Opytimizer classes and will not be backward-compatible with past versions.

    Includes (or changes)

    • optimizers.misc.aoa
    • optimizers.population.ao
    • optimizers.spaces.grid
    Source code(tar.gz)
    Source code(zip)
  • v2.1.3(Mar 10, 2021)

    Changelog

    Description

    Welcome to v2.1.3 release.

    In this release, we have added the following optimizers: COA, JS and NBJS. Additionally, we have fixed the roulette selection for minimization problems, as pointed out by @Martsks.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • optimizers.evolutionary.ga
    • optimizers.population.coa
    • optimizers.swarm.js
    Source code(tar.gz)
    Source code(zip)
  • v2.1.2(Dec 3, 2020)

    Changelog

    Description

    Welcome to v2.1.2 release.

    In this release, we have added a set of new optimizers, as follows: ABO, ASO, BOA, BSA, CSA, DOA, EHO, EPO, GWO, HGSO, HHO, MFO, PIO, QSA, SFO, SOS, SSA, SSO, TWO, WOA and WWO.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • optimizers
    Source code(tar.gz)
    Source code(zip)
  • v2.1.1(Nov 25, 2020)

    Changelog

    Description

    Welcome to v2.1.1 release.

    In this release, we have added some HS-based optimizers and fixed an issue regarding the store_best_only flag.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • optimizers
    • utils.history
    Source code(tar.gz)
    Source code(zip)
  • v2.1.0(Jul 29, 2020)

    Changelog

    Description

    Welcome to v2.1.0 release.

    In this release, we have fixed the UMDA nomenclature, corrected some optimizers' docstrings and added a soft penalization to constrained optimization, which we believe will enable users to design more appropriate constrained objectives. Furthermore, we added a deterministic trait to the optimizers' unit tests so they are able to converge.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • core.function
    • optimizers
    • tests
    Source code(tar.gz)
    Source code(zip)
  • v2.0.2(Jul 10, 2020)

    Changelog

    Description

    Welcome to v2.0.2 release.

    In this release, we have added BMRFO, SAVPSO, UDMA and VPSO optimizers.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • optimizers.boolean.bmrfo
    • optimizers.boolean.udma
    • optimizers.swarm.pso
    Source code(tar.gz)
    Source code(zip)
  • v2.0.1(Jun 26, 2020)

  • v2.0.0(May 7, 2020)

    Changelog

    Description

    Welcome to v2.0.0 release.

    In this release, we have reworked some inner structures, added the usability of decorators and revamped some optimizers. Additionally, we added a bunch of new optimizers and their tests.

    Note that this is a major release; therefore, we suggest updating the library as soon as possible, as it will not be compatible with older versions.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • opytimizer
    Source code(tar.gz)
    Source code(zip)
  • v1.1.3(Mar 31, 2020)

    Changelog

    Description

    Welcome to v1.1.3 release.

    In this release, we have improved the code readability and merged some child optimizers into their parents' modules.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • opytimizer
    Source code(tar.gz)
    Source code(zip)
  • v1.1.2(Feb 20, 2020)

    Changelog

    Description

    Welcome to v1.1.2 release.

    In this release, we have fixed some nasty bugs that may cause some particular issues. Additionally, we added a method for handling constrained optimization.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • core.function
    Source code(tar.gz)
    Source code(zip)
  • v1.1.1(Jan 15, 2020)

    Changelog

    Description

    Welcome to v1.1.1 release.

    In this release, we have fixed some nasty bugs that may cause some particular issues. Additionally, we added several new optimizers to the library and a convergence module (belongs to the visualization package).

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • optimizers.gsa
    • optimizers.hc
    • optimizers.rpso
    • optimizers.sa
    • optimizers.sca
    • visualization.convergence
    Source code(tar.gz)
    Source code(zip)
  • v1.1.0(Oct 3, 2019)

    Changelog

    Description

    Welcome to v1.1.0 release.

    This is a minor release that will probably break backward compatibility. Some base structures were changed in order to enhance the library's performance.

    Essentially, new structures were created to provide the building tools for tree-based evolutionary algorithms, such as Genetic Programming.

    Additionally, we added a constants module to the utilities package and an exception module. This will help guide users when they input wrong information.

    MultiFunction was renamed to WeightedFunction, which is more suitable according to its usage.

    History has been reworked as well, making it possible to dynamically create properties through the dump() method.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • examples/applications
    • core.node
    • function.weighted
    • math.general
    • optimizers.gp
    • opytimizer
    • spaces.tree
    • tests
    • utils.constants
    • utils.exceptions
    • utils.history
    Source code(tar.gz)
    Source code(zip)
  • v1.0.7(Sep 12, 2019)

    Changelog

    Description

    Welcome to v1.0.7 release. We added Opytimizer to the pip repository, fixed the optimization task timing, reworked some tests, added new integrations (Recogners library), changed our license for further publication, and fixed an issue regarding agents being out of bounds.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes (or changes)

    • examples/integrations/recogners
    • opytimizer
    • optimizers
    • tests
    Source code(tar.gz)
    Source code(zip)
  • v1.0.6(Jun 21, 2019)

    Changelog

    Description

    Welcome to v1.0.6 release. We added new interesting things, such as a new optimizer, more benchmarking functions, bug fixes (mostly agents being out of bounds), a reworked history object and some adjusted tests.

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes

    • math.benchmark
    • optimizers.wca
    • utils.history
    • tests
    Source code(tar.gz)
    Source code(zip)
  • v1.0.5(Apr 18, 2019)

    Changelog

    Description

    Welcome to v1.0.5 release. We added new interesting things, such as new optimizers, some reworked tests for better use cases, a multi-objective strategy for handling problems with more than one objective function, and much more!

    Please read the docs at: opytimizer.readthedocs.io

    Also, stay tuned for our next updates!

    Includes

    • examples.integrations.sklearn
    • functions.multi
    • optimizers.abc
    • optimizers.bha
    • optimizers.hs
    • tests
    • optimizers.ihs
    Source code(tar.gz)
    Source code(zip)
  • v1.0.4(Mar 22, 2019)

  • v1.0.3(Mar 22, 2019)

    Changelog

    Description

    Welcome to v1.0.3 release. We have added methods for supporting new optimizers. Additionally, we have fixed some previous implementations and improved their convergence. Everything should be appropriate now.

    Some examples integrating PyTorch with Opytimizer were created as well. Ranging from linear regression to long short-term memory networks, we hope to continue improving our library to serve you well.

    Again, every test is implemented, achieving a 100% coverage score. Please refer to the wiki in order to run them.

    Please stay tuned for our next updates and our newest integrations (Sklearn and Tensorflow)!

    Includes

    • optimizers.aiwpso
    • optimizers.ba
    • optimizers.cs
    • optimizers.fa
    • examples.integrations.pytorch
    Source code(tar.gz)
    Source code(zip)
  • v1.0.2(Mar 7, 2019)

    Changelog

    Description

    Welcome to v1.0.2 release. We have added methods for supporting hypercomplex representations. From math modules to new spaces, we support any hypercomplex approach, ranging from complex numbers to octonions.

    A History class has been added as well. It will serve as the structure that holds vital information from the optimization task. In the future, we will support visualization and plot graphing.

    Also, the internal class has been removed. All of its contents were moved to the core.function module. For now, this will be our new structure (there is a slight chance it will be modified in the future to accommodate multi-objective functions).

    Finally, we have tests. Every test is implemented, achieving a 100% coverage score. Please refer to the wiki in order to run them.

    Please stay tuned for our next updates!

    Includes

    • math.hypercomplex
    • spaces
    • utils.history
    • tests (100% coverage)

    Excludes

    • functions
    Source code(tar.gz)
    Source code(zip)
  • v1.0.1(Feb 26, 2019)

    Changelog

    Description

    Welcome to v1.0.1 release. Essentially, we have reworked some basic structures, added a new math distribution module and a new optimizer (Flower Pollination Algorithm). Please stay tuned for our next updates!

    Includes

    • math.distribution
    • optimizers.fpa
    Source code(tar.gz)
    Source code(zip)
  • v1.0.0(Feb 26, 2019)

    Changelog

    Description

    This is the initial release of Opytimizer. It includes all the basic modules needed to work with the library. One can create an internal optimization function and apply a Particle Swarm Optimization optimizer to it. Please check the examples folder or read the docs in order to learn how to use this library.

    Includes

    • core
    • functions
    • math
    • optimizers
    • utils
    Source code(tar.gz)
    Source code(zip)
Owner
Gustavo Rosa