Repository of best practices for deep learning in Julia, inspired by fastai

Overview

FastAI

Docs: Stable | Dev

FastAI.jl is inspired by fastai and is a repository of best practices for deep learning in Julia. Its goal is to make creating state-of-the-art models easy: FastAI enables the design, training, and delivery of deep learning models that compete with the best in class, using just a few lines of code.

Install with

using Pkg
Pkg.add("FastAI")

or try it out with this Google Colab template.

As an example, here is how to train an image classification model:

using FastAI
data, blocks = loaddataset("imagenette2-160", (Image, Label))
method = ImageClassificationSingle(blocks)
learner = methodlearner(method, data, callbacks=[ToGPU()])
fitonecycle!(learner, 10)
showoutputs(method, learner)

Please read the documentation for more information and see the setup instructions.

Comments
  • Add Time Series Block

    Add Time Series Block

    Added Time Series Container and Block. It is capable of loading all datasets from timeseriesclassification. The .ts files are loaded using a Julia translation of this method.
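    The loader itself isn't shown here, but as a rough, hypothetical sketch of what parsing a univariate .ts file could look like (read_ts_file is an illustrative helper, not the PR's actual code):

    # Hypothetical sketch: header lines start with '@'; after the "@data" marker,
    # each line holds comma-separated values followed by ':' and the class label.
    function read_ts_file(path)
        series, labels = Vector{Vector{Float64}}(), String[]
        indata = false
        for line in eachline(path)
            if !indata
                indata = lowercase(strip(line)) == "@data"
                continue
            end
            isempty(strip(line)) && continue
            parts = split(line, ':')
            push!(series, parse.(Float64, split(parts[1], ',')))
            push!(labels, String(strip(parts[end])))
        end
        return series, labels
    end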

    I have also added a basic test case for the recipe. This allows us to do the following

    using FastAI
    
    data, blocks = load(datarecipes()["ecg5000"])
    nobs(data)
    series, class = getobs(data, 10)
    

    Just wanted to get some initial thoughts on the work; there might be more changes as I continue to work on the other parts.

    opened by codeboy5 23
  • TagBot trigger issue

    TagBot trigger issue

    This issue is used to trigger TagBot; feel free to unsubscribe.

    If you haven't already, you should update your TagBot.yml to include issue comment triggers. Please see this post on Discourse for instructions and more details.

    If you'd like for me to do this for you, comment TagBot fix on this issue. I'll open a PR within a few hours, please be patient!

    opened by JuliaTagBot 17
  • Blocks and container added for Text Dataset

    Blocks and container added for Text Dataset

    Registered the NLP (text) datasets to be added in the upcoming months and added functions for the blocks of the text dataset. All the registered NLP datasets, along with their forthcoming models, will be added. I am exploring JuliaText, MLUtils, and other packages along with FastAI concepts so that these datasets can work well with Flux. Since almost all the text datasets are in CSV format, it will be easy to load them and create the corresponding containers; I am working on further concepts to implement these text datasets.

    Currently I have added the basic structure of the text data, comprising the blocks and the containers. I have researched a lot over the past week (understanding the FastAI docs and codebase). I am currently working on adding a textrow block along with recipes.jl, and on two datasets, "imdb" and "amazon_review_full"; both have different folder structures, so different blocks are required. I am also going through the two papers that built state-of-the-art models for these datasets and working on their implementation. Any reviews thus far will be appreciated.

    Reopened PR #100; I needed to delete that repo due to a merging issue.

    opened by arcAman07 12
  • InceptionTime Model for Time Series

    InceptionTime Model for Time Series

    This PR will contain the implementation of the InceptionTime model and its use for classification and regression tasks.

    Some of the commits from PR #253 are also in this PR, but I will take care of them when that PR is merged.

    opened by codeboy5 9
  • Added Model for Time Series Classification

    Added Model for Time Series Classification

    I have added the code for a basic RNN Model for the task of time series classification.

    > data, blocks = load(datarecipes()["ecg5000"]);
    > task = TSClassificationSingle(blocks, data);
    > model = FastAI.taskmodel(task);
    > traindl, validdl = taskdataloaders(data, task, 32);
    > callbacks = [ToGPU(), Metrics(accuracy)];
    > learner = Learner(model, tasklossfn(task); data=(traindl, validdl), optimizer=ADAM(), callbacks = callbacks);
    > fitonecycle!(learner, 10, 0.033)
    

    As I discussed with @darsnack, the idea is to add an encoding that does the reshaping to (features, batch, time) instead of doing it inside the RNN model. I am working on that right now.
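    For reference, a minimal sketch of what that reshaping could look like outside any encoding, assuming a raw batch arrives as a (features, time, batch) array (the input layout here is an assumption for illustration):

    # Permute a (features, time, batch) array to (features, batch, time) and split it
    # into the per-timestep matrices that Flux recurrent layers consume.
    x = rand(Float32, 1, 140, 32)                     # (features, time, batch)
    xp = permutedims(x, (1, 3, 2))                    # (features, batch, time)
    timesteps = [xp[:, :, t] for t in 1:size(xp, 3)]  # Vector of (features, batch) matrices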

    I have removed the type from StackedLSTM as it was redundant.

    opened by codeboy5 9
  • FastAI seems very slow compared to "vanilla" Flux

    FastAI seems very slow compared to "vanilla" Flux

    When I try to train a simple ResNet on the CIFAR10 dataset, FastAI seems very slow compared to Flux (≈ 9–19 times slower). It could be a garbage collector problem: with Flux I can use a batch size of 512, while with FastAI I can't exceed 128 without getting an out-of-memory error.

    FastAI code:

    using FastAI
    using ResNet9 # Pkg.add(url = "https://github.com/a-r-n-o-l-d/ResNet9.jl", rev="v0.1.1")
    
    data, blocks = loaddataset("cifar10", (Image, Label))
    method = ImageClassificationSingle(blocks)
    model = resnet9(inchannels=3, nclasses=10, dropout=0.0)
    learner = methodlearner(method, data; 
        lossfn=Flux.crossentropy,
        callbacks=[ToGPU()],
        batchsize=16,
        model=model,
        optimizer=Descent())
    
    @time fitonecycle!(learner, 5, 1f-3, pct_start=0.5, divfinal=100, div=100)
    

    Flux code:

    using Flux
    using Flux: DataLoader, onehotbatch
    using Augmentor
    using MLDatasets
    using ParameterSchedulers
    using ParameterSchedulers: Scheduler
    using ResNet9 # Pkg.add(url = "https://github.com/a-r-n-o-l-d/ResNet9.jl", rev="v0.1.1")
    
    normpip = SplitChannels() |> PermuteDims(3, 2, 1) |> ConvertEltype(Float32)
    
    labels = CIFAR10.classnames() .|> Symbol
    
    function datasets(batchsize)
        train = let
            x = CIFAR10.traintensor() |> CIFAR10.convert2image
            y = map(i -> labels[i + 1], CIFAR10.trainlabels())
            DataLoader((x, y), batchsize = batchsize, shuffle = true, partial = false)
        end
    
        test = let
            x = CIFAR10.testtensor() |> CIFAR10.convert2image
            y = map(i -> labels[i + 1], CIFAR10.testlabels())
            DataLoader((x, y), batchsize = batchsize)
        end
        
        train, test
    end
    
    function minibatch(x, y)
        h, w, n = size(x)
        xb = Array{Float32}(undef, w, h, 3, n)
        augmentbatch!(CPUThreads(), xb, x, normpip)
        yb = onehotbatch(y, labels)
        xb, yb
    end
    
    function train!(model, optimiser, nepochs)
        loss_hist = []
        loss(x, y) = Flux.crossentropy(model(x), y)
        ps = params(model)
        for e in 1:nepochs
            # Training phase
            tloss = 0
            trainmode!(model)
            for (x, y) ∈ train
                x, y = minibatch(x, y) |> gpu
                gs = gradient(ps) do
                    l = loss(x, y)
                    tloss += l
                    l
                end
                Flux.Optimise.update!(optimiser, ps, gs)
            end
            tloss /= length(train)
            # Validation phase
            testmode!(model)
            vloss = 0
            for (x, y) ∈ test
                x, y = minibatch(x, y) |> gpu
                vloss += loss(x, y)
            end
            vloss /= length(test)
            push!(loss_hist, (tloss, vloss))
        end
        
        loss_hist
    end
    
    train, test = datasets(16)
    nepochs = 5
    s = Triangle(λ0 = 1f-5, λ1 = 1f-3, period = nepochs * length(train))
    opt = Scheduler(s, Descent())
    model = resnet9(inchannels = 3, nclasses = 10, dropout = 0.0) |> gpu
    @time train!(model, opt, nepochs)
    

    Results on an RTX 2080 Ti:
    FastAI: 1841.008685 seconds (3.92 G allocations: 212.561 GiB, 59.59% gc time, 0.00% compilation time)
    Flux: 98.444806 seconds (106.49 M allocations: 16.643 GiB, 3.58% gc time, 2.58% compilation time)

    Results on a Quadro P5000:
    FastAI: 1574.714976 seconds (3.92 G allocations: 212.473 GiB, 11.08% gc time)
    Flux: 177.416636 seconds (105.55 M allocations: 16.639 GiB, 2.05% gc time, 1.42% compilation time)

    opened by a-r-n-o-l-d 9
  • Discriminative learning rates

    Discriminative learning rates

    Discriminative learning rates means using different learning rates for different parts of a model, so-called layer groups. This is used in fastai when fine-tuning models.
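    As a rough sketch of the idea with plain Flux (not FastAI.jl's eventual API; the layer grouping and learning rates are illustrative assumptions):

    using Flux

    # Split a model into layer groups and update each group with its own optimiser,
    # giving the earlier ("backbone") layers a smaller learning rate than the head.
    model = Chain(Dense(10, 32, relu), Dense(32, 32, relu), Dense(32, 2))
    groups = (model[1:2], model[3:3])      # layer groups: backbone, head
    opts = (Descent(1e-4), Descent(1e-2))  # discriminative learning rates

    x, y = rand(Float32, 10, 8), Flux.onehotbatch(rand(1:2, 8), 1:2)
    gs = gradient(() -> Flux.logitcrossentropy(model(x), y), Flux.params(model))
    for (group, opt) in zip(groups, opts)
        Flux.Optimise.update!(opt, Flux.params(group), gs)
    end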

    enhancement numfocusgrant fastai-parity 
    opened by lorenzoh 9
  • word and character level word tokenizer.

    word and character level word tokenizer.

    Some miscellaneous changes with the tokenizer being the main focus. Implements the LearnBase getobs and nobs methods. Uses the WordTokenizers module.
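    As a hypothetical sketch of what such a container can look like (TokenizedTexts is illustrative only, not the PR's actual type):

    using WordTokenizers
    import LearnBase

    # A text container implementing the LearnBase observation interface,
    # tokenizing documents lazily on access.
    struct TokenizedTexts
        texts::Vector{String}
    end

    LearnBase.nobs(ds::TokenizedTexts) = length(ds.texts)
    LearnBase.getobs(ds::TokenizedTexts, i::Int) = tokenize(ds.texts[i])

    ds = TokenizedTexts(["FastAI.jl is inspired by fastai.", "Julia is fast."])
    LearnBase.getobs(ds, 2)  # tokens of the second document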

    Closes #24

    opened by SamuelzXu 9
  • Add Container and Block for Text

    Add Container and Block for Text

    Tried starting to create a simple text recipe based on the ImageFolders dataset recipe. This specifically works for imdb and similar datasets. Any feedback is highly appreciated.

    opened by Chandu-4444 8
  • Added Time Series Container and Block

    Added Time Series Container and Block

    Added Time Series Container and Block. Currently it can only load univariate time series. This is work in progress for issue #155. I was planning to add a loaddataset function for such datasets. Currently all datasets share the same root URL: "https://s3.amazonaws.com/fast-ai-". For time series datasets the root URL is different, so I think we can proceed by adding a root_url field to the FastAIDataset structure. How does this sound?
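    A minimal sketch of the proposed change (field names are illustrative; the real FastAIDataset has more fields):

    const ROOT_URL = "https://s3.amazonaws.com/fast-ai-"

    struct FastAIDataset
        name::String
        checksum::String
        root_url::String  # proposed field: per-dataset base URL
    end

    # Existing datasets keep the default; time series datasets would pass their own base URL.
    FastAIDataset(name, checksum; root_url = ROOT_URL) = FastAIDataset(name, checksum, root_url)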

    opened by codeboy5 8
  • Problem in ResNet50 backbone of "Image segmentation" example

    Problem in ResNet50 backbone of "Image segmentation" example

    I suspect that the following code line from the Image segmentation example

    backbone = Metalhead.ResNet50(pretrain=true).layers[1:end-3]
    

    is not doing what is intended, since ResNet50 from Metalhead (https://github.com/darsnack/Metalhead.jl/tree/darsnack/vision-refactor) returns a two-item Chain (backbone and head), so the 1:end-3 indexing returns an empty Chain.

    Funnily enough, with the model returned by methodmodel (basically a Conv((1,1), 3=>32)) the example still works and is able to produce some image segmentation (does it just work like a pixel color indexer?).

    I'd say the expected code should be something like Metalhead.ResNet50(pretrain=true).layers[1].layers, and I would open a PR, but I'm not sure, since with that change the example fails later in the training loop.
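    For illustration, a hedged sketch of inspecting the refactored model before choosing a backbone (exact indexing depends on the Metalhead version in use):

    using Metalhead

    model = Metalhead.ResNet50(pretrain = true)
    length(model.layers)        # 2 in the refactor: (backbone, classification head)
    backbone = model.layers[1]  # keep the convolutional trunk, drop the head
    # The old model.layers[1:end-3] indexing assumed a flat Chain and now yields an
    # empty Chain, which is why the example silently falls back to a tiny Conv model.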

    opened by cdsousa 7
  • CompatHelper: bump compat for MLUtils to 0.4, (keep existing compat)

    CompatHelper: bump compat for MLUtils to 0.4, (keep existing compat)

    This pull request changes the compat entry for the MLUtils package from 0.2.6, 0.3 to 0.2.6, 0.3, 0.4. This keeps the compat entries for earlier versions.

    Note: I have not tested your package with this new compat entry. It is your responsibility to make sure that your package tests pass before you merge this pull request.

    opened by github-actions[bot] 0
  • loading datasets fails under proxy, but Base.download works

    loading datasets fails under proxy, but Base.download works

    Package Version

    0.5.0

    Julia Version

    1.8.3

    OS / Environment

    Windows10

    Describe the bug

    The downloads do not work under a proxy, although Base.download and Downloads.download work just fine. HTTP_PROXY and HTTPS_PROXY are set properly, e.g. ENV["HTTP_PROXY"] = "http://127.0.0.1:3128".

    using FastAI
    
    imagenette2_url = "https://s3.amazonaws.com/fast-ai-imageclas/imagenette2-160.tgz"
    FastAI.load(datasets()["imagenette2-160"])
    
    Do you want to download the dataset from https://s3.amazonaws.com/fast-ai-imageclas/imagenette2-160.tgz to "D:\z_installed_programs\julia-depot\datadeps\fastai-imagenette2-160"?
    [y/n]
    y
    ERROR: HTTP.Exceptions.RequestError(HTTP.Messages.Request:
    """
    GET /fast-ai-imageclas/imagenette2-160.tgz HTTP/1.1
    Host: s3.amazonaws.com
    Accept: */*
    User-Agent: HTTP.jl/1.8.3
    Content-Length: 0
    
    [Message Body was streamed]""", Base.IOError("X509 - Certificate verification failed, e.g. CRL, CA or signature check failed", -9984))
    

    Using Downloads.download or Base.download works just fine under the same proxy conditions.

    import Downloads
    imagenette2_url = "https://s3.amazonaws.com/fast-ai-imageclas/imagenette2-160.tgz"
    Downloads.download(imagenette2_url, "imagenette2-160.tgz")
    

    Steps to Reproduce

    see above

    Expected Results

    see above

    Observed Results

    see above

    Relevant log output

    see above

    bug 
    opened by MariusDrulea 0
  • the dataset is deleted right after download in Windows10

    the dataset is deleted right after download in Windows10

    Package Version

    0.5.0

    Julia Version

    1.8.3

    OS / Environment

    Windows10

    Describe the bug

    I just ran the following code to download the coco_sample dataset: FastAI.load(datasets()["coco_sample"]). The download is successful. After the download, 7zip is called to unpack the archive. After the unzipping, the following error occurs. It looks like the script tries to delete the folder it just created, fastai-coco_sample. This happens with all the datasets.

    ERROR: LoadError: IOError: rm("D:\\z_installed_programs\\julia-depot\\datadeps\\fastai-coco_sample"): resource busy or locked (EBUSY)
    

    Note that I have Julia's DEPOT_PATH environment variable set to D:\\z_installed_programs\\julia-depot instead of the default home directory of the user.

    Steps to Reproduce

    using FastAI
    FastAI.load(datasets()["coco_sample"])
    

    Expected Results

    get the coco sample dataset on the PC

    Observed Results

    the coco_sample archive is downloaded and unzipped, then the error occurs, and the fastai-coco_sample folder containing the archive and the unzipped data is deleted.

    Relevant log output

    ERROR: LoadError: IOError: rm("D:\\z_installed_programs\\julia-depot\\datadeps\\fastai-coco_sample"): resource busy or locked (EBUSY)
    Stacktrace:
      [1] uv_error
        @ .\libuv.jl:97 [inlined]
      [2] rm(path::String; force::Bool, recursive::Bool)
        @ Base.Filesystem .\file.jl:306
      [3] checkfor_mv_cp_cptree(src::String, dst::String, txt::String; force::Bool)
        @ Base.Filesystem .\file.jl:330
      [4] #mv#15
        @ .\file.jl:425 [inlined]
      [5] (::FastAI.Datasets.var"#10#11")(f::String)
        @ FastAI.Datasets D:\z_installed_programs\julia-depot\packages\FastAI\as9UG\src\datasets\fastaidatasets.jl:261
      [6] #16
        @ D:\z_installed_programs\julia-depot\packages\DataDeps\ae6dT\src\resolution_automatic.jl:122 [inlined]
      [7] cd(f::DataDeps.var"#16#17"{FastAI.Datasets.var"#10#11", String}, dir::String)
        @ Base.Filesystem .\file.jl:101
      [8] run_post_fetch(post_fetch_method::FastAI.Datasets.var"#10#11", fetched_path::String)
        @ DataDeps D:\z_installed_programs\julia-depot\packages\DataDeps\ae6dT\src\resolution_automatic.jl:119
      [9] download(datadep::DataDeps.DataDep{String, String, typeof(DataDeps.fetch_default), FastAI.Datasets.var"#10#11"}, localdir::String; remotepath::String, i_accept_the_terms_of_use::Nothing, skip_checksum::Bool)
        @ DataDeps D:\z_installed_programs\julia-depot\packages\DataDeps\ae6dT\src\resolution_automatic.jl:84
     [10] download
        @ D:\z_installed_programs\julia-depot\packages\DataDeps\ae6dT\src\resolution_automatic.jl:63 [inlined]
     [11] handle_missing
        @ D:\z_installed_programs\julia-depot\packages\DataDeps\ae6dT\src\resolution_automatic.jl:10 [inlined]
     [12] _resolve
        @ D:\z_installed_programs\julia-depot\packages\DataDeps\ae6dT\src\resolution.jl:83 [inlined]
     [13] resolve(datadep::DataDeps.DataDep{String, String, typeof(DataDeps.fetch_default), FastAI.Datasets.var"#10#11"}, inner_filepath::String, calling_filepath::String)
        @ DataDeps D:\z_installed_programs\julia-depot\packages\DataDeps\ae6dT\src\resolution.jl:29
     [14] resolve(datadep_name::String, inner_filepath::String, calling_filepath::String)
        @ DataDeps D:\z_installed_programs\julia-depot\packages\DataDeps\ae6dT\src\resolution.jl:54
     [15] resolve
        @ D:\z_installed_programs\julia-depot\packages\DataDeps\ae6dT\src\resolution.jl:73 [inlined]
     [16] makeavailable
        @ D:\z_installed_programs\julia-depot\packages\FastAI\as9UG\src\datasets\loaders.jl:46 [inlined]
     [17] loaddata(loader::FastAI.Datasets.DataDepLoader)
        @ FastAI.Datasets D:\z_installed_programs\julia-depot\packages\FastAI\as9UG\src\datasets\loaders.jl:50
     [18] (::FastAI.Registries.var"#8#13")(row::NamedTuple{(:id, :description, :size, :tags, :package, :downloaded, :loader), Tuple{String, Union{Missing, String}, Union{Missing, String}, Vector{String}, Module, Bool, FastAI.Datasets.DatasetLoader}})
        @ FastAI.Registries D:\z_installed_programs\julia-depot\packages\FastAI\as9UG\src\Registries\datasets.jl:38
     [19] load(entry::FeatureRegistries.RegistryEntry; kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
        @ FeatureRegistries D:\z_installed_programs\julia-depot\packages\FeatureRegistries\FBMLI\src\registry.jl:135
     [20] load
        @ D:\z_installed_programs\julia-depot\packages\FeatureRegistries\FBMLI\src\registry.jl:135 [inlined]
    
    bug 
    opened by MariusDrulea 2
  • Add Metalhead.jl models to model registry

    Add Metalhead.jl models to model registry

    This populates the model registry from #267 with models from Metalhead.jl.

    Depends on #267 as well as unreleased [email protected]. Possibly supersedes https://github.com/FluxML/Metalhead.jl/pull/153.

    See #267 for usage.

    PR Checklist

    • [ ] Tests are added
    • [ ] Documentation, if applicable
    opened by lorenzoh 0
  • Add a feature registry for models

    Add a feature registry for models

    Implements #246.

    PR Checklist

    • [x] Tests are added
    • [x] Documentation (this is an internal change for now, docs will be added in follow-up when functionality is made available and used in domain package)

    Usage examples

    From #269:

    
    using FastAI: models
    # loading this adds the models to registry
    using FastVision
    
    
    # Load original model, 1000 output classes, no weights (`ResNet(18)`):
    load(models()["metalhead/resnet18"]);
    
    # Load original model, 1000 output classes, with weights (`ResNet(18), pretrain=true`):
    load(models()["metalhead/resnet18"], pretrained = true);
    
    # Load only backbone, without weights:
    load(models()["metalhead/resnet18"], variant = "backbone");
    
    # Load only backbone, with weights:
    load(models()["metalhead/resnet18"], pretrained = true, variant = "backbone");
    
    # Load model for task, adapting layers as necessary:
    task = ImageClassificationSingle((256, 256), 1:5, C = Gray{N0f8}) # input with 1 color channel, 5 classes
    load(models()["metalhead/resnet18"], input = task.blocks.x, output = task.blocks.y)
    # Also works with pretrained weights
    load(models()["metalhead/resnet18"], pretrained = true, input = task.blocks.x, output = task.blocks.y)
    
    # Correct variants are selected automatically given the blocks:
    load(models()["metalhead/resnet18"], output = FastAI.ConvFeatures)  # uses backbone variant
    
    
    # Support for multiple checkpoints, selectable by name:
    load(models()["metalhead/resnet18"], checkpoint = "imagenet1k")
    
    

    Docs

    The proposed interface is well-described by the registry description, pasted below:

    A FeatureRegistry for models. Allows you to find and load models for various learning tasks using a unified interface. Call models() to see a table view of available models:

    using FastAI
    models()
    

    Which models are available depends on the loaded packages. For example, FastVision.jl adds vision models from Metalhead to the registry. Index the registry with a model ID to get more information about that model:

    using FastAI: models
    using FastVision  # loading the package extends the list of available models
    
    models()["metalhead/resnet18"]
    

    If you've selected a model, call load to instantiate it:

    model = load("metalhead/resnet18")
    

    By default, load loads a default version of the model without any pretrained weights.

    load(model) also accepts keyword arguments that allow you to specify variants of the model and weight checkpoints that should be loaded.

    Loading a checkpoint of pretrained weights:

    • load(entry; pretrained = true): Use any pretrained weights, if they are available.
    • load(entry; checkpoint = "checkpoint-name"): Use the weights with given name. See entry.checkpoints for available checkpoints (if any).
    • load(entry; pretrained = false): Don't use pretrained weights

    Loading a model variant for a specific task:

    • load(entry; input = ImageTensor, output = OneHotLabel): Load a model variant matching an input and output block.
    • load(entry; variant = "backbone"): Load a model variant by name. See entry.variants for available variants.
    opened by lorenzoh 2
Releases(v0.5.0)
  • v0.5.0(Oct 22, 2022)

    FastAI v0.5.0

    Diff since v0.4.3

    Closed issues:

    • Move datasets to MLDatasets.jl (#22)
    • MLUtils.jl transition (#196)
    • Functions getcoltypes and gettransformdict are not exported properly (#210)
    • Makie 0.17 support (#224)
    • Keypoint regression example: The input graph contains at least one loop (#231)
    • Log to TensorBoard link in TOC (#232)
    • Log to TensorBoard link in TOC (#233)
    • Docs aren't working correctly. (#237)
    • Make a subpackage for Makie support (#241)

    Merged pull requests:

    • Update TagBot.yml (#226) (@lorenzoh)
    • Improve onboarding experience (#227) (@lorenzoh)
    • Switch LearnBase + MLDataPattern + DataLoaders -> MLUtils (#229) (@lorenzoh)
    • Fix link to TensorBoard how-to (#234) (@lorenzoh)
    • Add Time Series Block (#239) (@codeboy5)
    • Move domain-specific functionality to subpackages (#240) (@lorenzoh)
    • CompatHelper: bump compat for UnicodePlots to 3, (keep existing compat) (#244) (@github-actions[bot])
    • Text classification task (#245) (@Chandu-4444)
    • Use Adam instead of ADAM (#247) (@lorenzoh)
    • CompatHelper: add new compat entry for TextAnalysis at version 0.7, (keep existing compat) (#249) (@github-actions[bot])
    • Added Model for Time Series Classification (#253) (@codeboy5)
    • InceptionTime Model for Time Series (#256) (@codeboy5)
    • Fix the broken link in the README (#257) (@nathanaelbosch)
    • CompatHelper: bump compat for PrettyTables to 2, (keep existing compat) (#260) (@github-actions[bot])
    • Fix _segmentationloss for 3D images (#261) (@itan1)
    • Update Pollen.jl documentation (#262) (@lorenzoh)
    • Fix UNet for 3D convolutions (specify ndim to convxlayer and ResBlock) (#263) (@itan1)
  • v0.4.3(May 14, 2022)

    FastAI v0.4.3

    Diff since v0.4.2

    Closed issues:

    • Register Everything (#206)

    Merged pull requests:

    • Add Container and Block for Text (#207) (@Chandu-4444)
    • Feature registries (#208) (@lorenzoh)
    • CompatHelper: add new compat entry for FeatureRegistries at version 0.1, (keep existing compat) (#225) (@github-actions[bot])
  • v0.4.2(Apr 30, 2022)

    FastAI v0.4.2

    Diff since v0.4.1

    Closed issues:

    • Keypoint Regression example in documentation (#218)
    • Documentation link broken for Custom Learning tasks (#220)

    Merged pull requests:

    • Keypoint regression example (#221) (@itan1)
    • Update to new Pollen template (#222) (@lorenzoh)
    • Add FluxTraining 0.3 compatibility (#223) (@lorenzoh)
  • v0.4.1(Apr 20, 2022)

    FastAI v0.4.1

    Diff since v0.4.0

    Closed issues:

    • Support for non-supervised learning tasks (#165)
    • LoadError in some pages of documentation. (#192)
    • Drop DLPipelines.jl (#197)
    • Update for Flux 0.13 (#201)
    • Error in reproducing the "Data containers" tutorial: "key :nothing not found" (#209)
    • Missing ProgressBars.jl import for Vision.imagedatasetstats (#214)
    • Use JpegTurbo.jl to load .jpg images (#216)

    Merged pull requests:

    • Add Flux 0.13 compatibility (#202) (@lorenzoh)
    • New documentation frontend (#203) (@lorenzoh)
    • Pollen docs update part 2 (#213) (@lorenzoh)
    • Ports over _predictx from DLPipelines.jl (#215) (@lorenzoh)
    • Fix progress bar in imagedatasetstats (#217) (@lorenzoh)
    • add ImageIO backend (#219) (@johnnychen94)
  • v0.4.0(Mar 19, 2022)

    FastAI v0.4.0

    Diff since v0.3.0

    Closed issues:

    • N-dimensional CNN models (#137)
    • Better visualization/interpretation API (#154)
    • FastAI seems very slow compared to "vanilla" Flux (#187)

    Merged pull requests:

    • change julia compat to 1.6 (#175) (@EvoArt)
    • Add support for 3D convolutional backbones (#181) (@lorenzoh)
    • Move domain-specific functionality to submodules (#186) (@lorenzoh)
    • Make block learning methods more modular (#188) (@lorenzoh)
    • CompatHelper: bump compat for LearnBase to 0.6, (keep existing compat) (#189) (@github-actions[bot])
    • Make UNet closer to fastai (#190) (@lorenzoh)
    • CompatHelper: bump compat for CSV to 0.10, (keep existing compat) (#191) (@github-actions[bot])
    • Add imdb_sample recipe (#193) (@Chandu-4444)
    • Add food-101 recipe (#194) (@Chandu-4444)
    • Any-length dimensions for Bounded (#195) (@lorenzoh)
    • Removing DLPipelines.jl; Learning method -> Learning task (#198) (@lorenzoh)
  • v0.3.0(Dec 11, 2021)

    FastAI v0.3.0

    Diff since v0.2.0

    v0.3.0

    Added

    • A new API for visualizing data. See this issue for motivation. This includes:

      • High-level functions for visualizing data related to a learning method: showsample, showsamples, showencodedsample, showencodedsamples, showbatch, showprediction, showpredictions, showoutput, showoutputs, showoutputbatch
      • Support for multiple backends, including a new text-based show backend that you can use to visualize data in a non-graphical environment. This is also the default unless Makie is imported.
      • Functions for showing blocks directly: showblock, showblocks
      • Interfaces for extension: ShowBackend, showblock!, showblocks!

    Removed

    • The old visualization API incl. all its plot* methods: plotbatch, plotsample, plotsamples, plotpredictions

    Closed issues:

    • Visualization functions are not working (#184)

    Merged pull requests:

    • CompatHelper: bump compat for CSV to 0.9, (keep existing compat) (#168) (@github-actions[bot])
    • New interpretation/Visualization API (#176) (@lorenzoh)
    • CompatHelper: add new compat entry for InlineTest at version 0.2, (keep existing compat) (#177) (@github-actions[bot])
    • CompatHelper: add new compat entry for ImageInTerminal at version 0.4, (keep existing compat) (#178) (@github-actions[bot])
    • CompatHelper: add new compat entry for Requires at version 1, (keep existing compat) (#179) (@github-actions[bot])
    • CompatHelper: add new compat entry for UnicodePlots at version 2, (keep existing compat) (#180) (@github-actions[bot])
    • Make Only more generic (#182) (@lorenzoh)
    • v0.3.0 (#185) (@lorenzoh)
  • v0.2.0(Sep 21, 2021)

    FastAI v0.2.0

    Diff since v0.1.0

    0.2.0

    Added

    Changed

    • Documentation sections to reference FasterAI interfaces
    • Breaking changes to methodlearner (see the example after this list):
      • callbacks is now accepted as a keyword argument
      • validdata is no longer a keyword argument
      • model and backbone are now keyword arguments; isbackbone was removed. If neither backbone nor model is given, blockbackbone supplies the default backbone.
      • see the updated docstring for details
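    A short sketch of the post-change call shape, based on the quickstart example above (the kwargs shown are the ones listed; defaults apply when they are omitted):

    using FastAI
    data, blocks = loaddataset("imagenette2-160", (Image, Label))
    method = ImageClassificationSingle(blocks)
    # model/backbone are keyword arguments now; with neither given,
    # blockbackbone picks a default backbone.
    learner = methodlearner(method, data; callbacks = [ToGPU()], batchsize = 16)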

    Closed issues:

    • FasterAI: a roadmap to user-friendly, high-level interfaces (#148)
    • Makie crashes precompilation (#156)
    • Problem in ResNet50 backbone of "Image segmentation" example (#169)

    Merged pull requests:

    • Add tabular model (#124) (@manikyabard)
    • Add tabular learning methods (#141) (@manikyabard)
    • Fix Pkg usage in developing.md (#149) (@amqdn)
    • Remove unused notebooks in docs (#150) (@amqdn)
    • FasterAI (#151) (@lorenzoh)
    • Fix WrapperBlock behavior and add Many (#158) (@lorenzoh)
    • Documentation for FasterAI updates (#161) (@lorenzoh)
    • CompatHelper: add new compat entry for "Setfield" at version "0.7" (#162) (@github-actions[bot])
    • Adding some more dataset recipes (#163) (@lorenzoh)
    • Breaking API changes to methodlearner (#164) (@lorenzoh)
    • CompatHelper: bump compat for IndirectArrays to 1, (keep existing compat) (#166) (@github-actions[bot])
    • update get_emb_sz method (#167) (@manikyabard)
    • Better docstrings for data block functions (#170) (@lorenzoh)
    • Setup step for encodings (#171) (@lorenzoh)
    • CompatHelper: bump compat for Setfield to 0.8, (keep existing compat) (#172) (@github-actions[bot])
    • Fix #169 by constructing pretrained backbone correctly (#173) (@lorenzoh)
    • Release v0.2.0 (#174) (@lorenzoh)
  • v0.1.0(Jul 28, 2021)

    FastAI v0.1.0

    Closed issues:

    • First pass at datasets and data loaders (#1)
    • Thank you! (#7)
    • MIT vs Apache license (#8)
    • Some maintenance and housekeeping things (#9)
    • Contributions? (#15)
    • Tutorial errors out (#19)
    • Add TableDataset (#23)
    • Learning Method: Single-label image classification (#25)
    • Learning method: Image segmentation (#29)
    • Learning Method: Keypoint Regression (#30)
    • Use case: Siamese networks for image similarity (#31)
    • Improve projective data augmentation (#32)
    • Learning Method: Multi-label image classification (#33)
    • Weight decay option in fitonecycle! (#34)
    • Discriminative learning rates (#35)
    • Augmentor.jl (#68)
    • Data Block API (#135)
    • Malformed FluxTraining compat requirement in Project.toml (#138)
    • Two FluxTraining entry in Project.toml (#139)

    Merged pull requests:

    • CompatHelper: add new compat entry for "Revise" at version "2.7" (#2) (@github-actions[bot])
    • CompatHelper: add new compat entry for "StatsBase" at version "0.33" (#3) (@github-actions[bot])
    • CompatHelper: add new compat entry for "Infiltrator" at version "0.3" (#4) (@github-actions[bot])
    • CompatHelper: add new compat entry for "Flux" at version "0.11" (#5) (@github-actions[bot])
    • CompatHelper: add new compat entry for "Zygote" at version "0.5" (#6) (@github-actions[bot])
    • Simplied Recorder and Metric (#10) (@opus111)
    • Use DocStringExtensions to remove manual types and signatures (#11) (@ToucheSir)
    • New README for FastAI (#14) (@opus111)
    • FastAI.jl revamp (#17) (@lorenzoh)
    • update docs deps (#18) (@lorenzoh)
    • Fix the Quickstart tutorial and make the docs refer to this repo instead of a fork (#20) (@dave7895)
    • added TableDataset (#26) (@manikyabard)
    • tutorial errors: changed {load->get}classesclassification (#58) (@SamuelzXu)
    • move docs to Pollen (#60) (@lorenzoh)
    • Lo/fix test (#66) (@lorenzoh)
    • Better vision augmentations (#67) (@lorenzoh)
    • CompatHelper: bump compat for "LearnBase" to "0.4" (#113) (@github-actions[bot])
    • CompatHelper: bump compat for "MosaicViews" to "0.3" (#115) (@github-actions[bot])
    • Move to DataAugmentation v0.2.0 (#116) (@lorenzoh)
    • Small fixes (#117) (@lorenzoh)
    • WIP: Ongoing development (#118) (@lorenzoh)
    • Develop (#119) (@lorenzoh)
    • CompatHelper: add new compat entry for "ShowCases" at version "0.1" (#120) (@github-actions[bot])
    • CompatHelper: add new compat entry for "JLD2" at version "0.4" (#121) (@github-actions[bot])
    • Replace AbstractPlotting.jl with Makie.jl (#122) (@lorenzoh)
    • CompatHelper: add new compat entry for "Makie" at version "0.13" (#123) (@github-actions[bot])
    • CompatHelper: bump compat for "Makie" to "0.14" (#125) (@github-actions[bot])
    • Docs: how-to for logging (#126) (@lorenzoh)
    • Docs: tutorial on dataset presizing (#127) (@lorenzoh)
    • CompatHelper: add new compat entry for "CSV" at version "0.8" (#128) (@github-actions[bot])
    • CompatHelper: add new compat entry for "DataFrames" at version "1" (#129) (@github-actions[bot])
    • Update keypoint regression tutorial to include custom learning method and plotting functions. (#130) (@lorenzoh)
    • Remove all reference to LearningTask. (#131) (@lorenzoh)
    • Update to FluxTraining.jl v0.2.0 interfaces (#134) (@lorenzoh)
    • Data block API (#136) (@lorenzoh)
    • Get ready for release of 0.1.0 (#145) (@lorenzoh)
Owner
FluxML
The Elegant Machine Learning Stack