The Fastest DNN Running Framework on Web Browser

Overview


WebDNN: Fastest DNN Execution Framework on Web Browser

WebDNN is an open-source software framework for executing pre-trained deep neural network (DNN) models in the web browser.

WebDNN can execute, in the browser, DNN models trained with several deep learning frameworks, including Keras, Chainer, Caffe, and PyTorch (via ONNX).

Why is WebDNN needed?

Deep neural networks (DNNs) are attracting attention for use in many applications. However, they require significant computational resources, and setting up an execution environment with hardware acceleration such as GPGPU involves a great deal of work. Providing DNN applications to end users is therefore very hard.

WebDNN solves this problem by using the web browser as an installation-free DNN execution environment. The framework optimizes the trained DNN model to compress the model data and accelerate execution, and runs it through modern JavaScript APIs such as WebAssembly and WebMetal to achieve zero-overhead execution. Empirical evaluations showed more than 200x acceleration.

Note: The WebGPU API introduced by Apple was renamed to WebMetal in 2019. In WebDNN 1.2.8, both WebMetal and the old name WebGPU are supported for compatibility. As the string constant, webgpu is currently used, but it will be changed to webmetal in a future version.

Performance

  • Compared processing time with Keras.js
  • Test environment:
    • MacBook Pro (Early 2015)
    • macOS 10.12.4 Sierra
    • Intel Core i5 2.7 GHz CPU
    • 16 GB Memory
    • Intel Iris Graphics 6100 GPU
    • Safari Technology Preview 30
  • Model: VGG16[1], Inception-v3[4], and ResNet50[2].
  • Input Shape: (1, 299, 299, 3) for Inception-v3, (1, 224, 224, 3) for others.

Benchmark result with Keras.js

Elapsed time per image is shown on the vertical axis on a logarithmic scale.

WebDNN with the WebMetal backend was significantly faster than Keras.js, and WebDNN with the WebAssembly backend was comparable to the GPU backend of Keras.js. For every DNN model and backend, WebDNN obtained better speed, and the improvement grows further when the graph transpiler's optimizations are applied.

Getting started in 30 seconds

Let's convert and execute the ResNet50 pre-trained Keras model[3] in your web browser.

First, save the ResNet50 pre-trained model provided by Keras.

from keras.applications import resnet50
model = resnet50.ResNet50(include_top=True, weights='imagenet')
model.save("resnet50.h5")

Next, convert the model with the CLI. In this phase, the model is optimized.

python ./bin/convert_keras.py resnet50.h5 --input_shape '(1,224,224,3)' --out output
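
If you prefer to drive the conversion from a Python script rather than the CLI, a rough sketch is shown below. It assumes the graph transpiler exposes KerasConverter and generate_descriptor under the module paths used here; check the documentation for the exact signatures in your WebDNN version.

from keras.applications import resnet50
from webdnn.frontend.keras.converter import KerasConverter
from webdnn.backend import generate_descriptor

# Load (or rebuild) the pre-trained model saved above.
model = resnet50.ResNet50(include_top=True, weights='imagenet')

# Convert the Keras model into WebDNN's intermediate graph representation.
graph = KerasConverter(batch_size=1).convert(model)

# Generate and save the descriptor for one backend (e.g. WebAssembly).
exec_info = generate_descriptor("webassembly", graph)
exec_info.save("./output")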

Then, the generated files (called a Descriptor) can be loaded and executed from JavaScript as follows:

let runner, image, probabilities;

async function init() {
    // Initialize descriptor runner
    runner = await WebDNN.load('./output');
    image = runner.inputs[0]; 
    probabilities = runner.outputs[0];
}

async function run() {
    // Set the value into input variable.
    image.set(await WebDNN.Image.getImageArray('./input_image.png'));
    
    // Run
    await runner.run(); 

    // Show the result
    console.log('Output', WebDNN.Math.argmax(probabilities));
}

WebDNN also supports Caffe and Chainer models.
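
For example, a Chainer model can be converted by tracing one forward pass and handing the input and output variables to the converter. The sketch below assumes the ChainerConverter API and the same generate_descriptor helper used for Keras; treat the module paths as approximate and check the documentation for your version.

import chainer
import chainer.functions as F
import chainer.links as L
import numpy as np
from webdnn.frontend.chainer import ChainerConverter
from webdnn.backend import generate_descriptor

# A tiny example network; any trained Chain works the same way.
model = chainer.Sequential(
    L.Linear(784, 100), F.relu,
    L.Linear(100, 10),
)

# Run one forward pass so the converter can capture the computation graph.
x = chainer.Variable(np.zeros((1, 784), dtype=np.float32))
y = model(x)

graph = ChainerConverter().convert([x], [y])
exec_info = generate_descriptor("webassembly", graph)
exec_info.save("./output")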

For more information, please see the documentation.

Setup

Please see the documentation.

A Docker image is also provided; see docker.

Applications / demos using WebDNN


  • [1] Karen Simonyan and Andrew Zisserman. 2014. Very Deep Convolutional Networks for Large-Scale Image Recognition. In Proceedings of the International Conference on Learning Representations (ICLR).
  • [2] Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. 2015. Deep Residual Learning for Image Recognition. In Proceedings of the Conference on Computer Vision and Pattern Recognition (CVPR). https://github.com/KaimingHe/deep-residual-networks
  • [3] Applications - Keras Documentation
  • [4] Christian Szegedy, Vincent Vanhoucke, Sergey Ioffe, Jon Shlens, and Zbigniew Wojna. 2016. Rethinking the Inception Architecture for Computer Vision. In Proceedings of the Conference on Computer Vision and Pattern Recognition (CVPR).
Comments
  • How to use Keras layer implemented by myself?

    How to use Keras layer implemented by myself?

    (Written by @Kiikurage) The original question was posted in Japanese; an English translation follows.


    (If English would be easier for you, I can write in English, so please let me know.)

    Keras does not seem to have a layer that adds a learnable bias at each position of the input, so I made a custom layer (Bias). How can I use a Keras model that contains this layer with WebDNN?

    For reference, Bias is a simple layer like the following.

    from keras.engine.topology import Layer
    from keras import initializers
    
    class Bias(Layer):
        """
        Custom keras layer that simply adds a scalar bias to each location in the input
        """
        
        def __init__(self, initializer='uniform', **kwargs):
            super(Bias, self).__init__(**kwargs)
            self.initializer = initializers.get(initializer)
        
        def build(self, input_shape):
            self.bias = self.add_weight(name='{}_W'.format(self.name), shape=(input_shape[-1],), initializer=self.initializer)
        
        def call(self, x):
            return x + self.bias
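
    The usual way to support such a layer is to register a custom converter handler for it. The sketch below is only illustrative: it assumes the KerasConverter.register_handler decorator and the converter methods (get_variable, convert_to_constant_variable, set_variable) that appear in the Conv2DTranspose patch quoted further down this page, and it relies on the layer storing its weight as self.bias. How the handler module gets imported by convert_keras.py depends on your WebDNN version, and the model has to be loaded with custom_objects={'Bias': Bias} so Keras can deserialize the layer.

    from webdnn.frontend.keras.converter import KerasConverter
    from webdnn.graph.order import OrderC

    @KerasConverter.register_handler("Bias")
    def _convert_bias(converter: KerasConverter, k_op: "Bias"):
        # Fetch the WebDNN variable corresponding to the layer's input tensor.
        x = converter.get_variable(converter.get_input_tensor(k_op)[0])

        # Wrap the learned bias weights as a constant variable (one value per channel).
        b = converter.convert_to_constant_variable(k_op.bias, OrderC)

        # Elementwise addition is expressed directly on WebDNN graph variables.
        y = x + b

        converter.set_variable(converter.get_output_tensor(k_op)[0], y)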
    
    opened by y-ich 27
  • Numerical Problem in WebGL Backend

    Numerical Problem in WebGL Backend

    I tried to run our model with the WebGL backend, but the result is wrong.

    Our generator model looks like this (architecture figure omitted):

    I tried to print the calculation results from both the WASM and WebGL backends.

    I found that the two vectors (64 * 128 * 128 feature maps) differ by only 0.08 in total before the last conv layer, but after the last conv layer the total difference becomes 54474...

    I uploaded some debug files here: model_before_last_conv is our model without the last conv layer (also without tanh), model_after_last_conv is our model with the last conv layer (without tanh), and model_last_conv contains only the last conv layer; I do not know how to debug it independently.

    Let me know if you need any further information

    opened by Aixile 23
  • U-Net does not seem to work

    U-Net does not seem to work

    I tried to use a pix2pix implementation: https://github.com/knok/vess2ret

    The main network architecture is U-Net, which concatenates each encoder layer's output with the corresponding layer on the decoder side.

    WebDNN does not seem to support this. I tried to implement _convert_conv2d_transpose; by itself it seems to work, but the input shape does not match. Maybe Concatenate layers are ignored?

     % python ~/drive/webdnn/bin/convert_keras.py --input_shape '(1, 512, 512, 3)' --out webdnn_output atob_unet.h5
    Using TensorFlow backend.
    [convert_keras.py] Generating feedforward graph
    (snip)
    /opt/p36/lib/python3.6/site-packages/webdnn-1.1.0-py3.6.egg/webdnn/util/console.py:27: Warning: [KerasConverter] omitting dropout
      warnings.warn(message, category)
    Traceback (most recent call last):
      File "/home/knok/drive/webdnn/bin/convert_keras.py", line 113, in <module>
        main()
      File "/home/knok/drive/webdnn/bin/convert_keras.py", line 63, in main
        graph = converter.convert(model)
      File "/opt/p36/lib/python3.6/site-packages/webdnn-1.1.0-py3.6.egg/webdnn/frontend/keras/converter.py", line 108, in convert
        self._convert_operator(node.outbound_layer)
      File "/opt/p36/lib/python3.6/site-packages/webdnn-1.1.0-py3.6.egg/webdnn/frontend/keras/converter.py", line 141, in _convert_operator
        return super(KerasConverter, self)._convert_operator(k_op)
      File "/opt/p36/lib/python3.6/site-packages/webdnn-1.1.0-py3.6.egg/webdnn/frontend/converter.py", line 108, in _convert_operator
        self._handler_map[self.__class__.__name__][operator_key](self, operator)
      File "/opt/p36/lib/python3.6/site-packages/webdnn-1.1.0-py3.6.egg/webdnn/frontend/keras/layers/convolutional.py", line 92, in _convert_conv2d_transpose
        y, = Deconvolution2D(None, ksize=ksize, stride=stride, padding=padding)(x, w) 
      File "/opt/p36/lib/python3.6/site-packages/webdnn-1.1.0-py3.6.egg/webdnn/graph/operators/deconvolution2d.py", line 62, in __call__
        "Input and Kernel variables of Deconvolution2D must be same channel size: " \
    AssertionError: Input and Kernel variables of Deconvolution2D must be same channel size: x.shape_dict[Axis.C]=1024, w.shape_dict[Axis.C]=512
    

    The following is the patch for Conv2DTranspose:

    --- a/src/graph_transpiler/webdnn/frontend/keras/layers/convolutional.py
    +++ b/src/graph_transpiler/webdnn/frontend/keras/layers/convolutional.py
    @@ -6,6 +6,7 @@ except ImportError as e:
     from webdnn.frontend.keras.converter import KerasConverter
     from webdnn.frontend.keras.layers.util import do_activation
     from webdnn.graph.operators.convolution2d import Convolution2D
    +from webdnn.graph.operators.deconvolution2d import Deconvolution2D
     from webdnn.graph.operators.zero_padding_1d import ZeroPadding1D
     from webdnn.graph.operators.zero_padding_2d import ZeroPadding2D
     from webdnn.graph.order import OrderC, OrderNCHW, OrderNHWC, OrderHWCN, OrderNTC
    @@ -65,8 +66,36 @@ def _convert_separable_conv2d(converter: KerasConverter, k_op: "keras.layers.Sep
     # noinspection PyUnusedLocal
     @KerasConverter.register_handler("Conv2DTranspose")
     def _convert_conv2d_transpose(converter: KerasConverter, k_op: "keras.layers.Conv2DTranspose"):
    -    # TODO
    -    raise NotImplementedError('[KerasConverter] keras.layers.Conv2DTranspose is not supported')
    +    x = converter.get_variable(converter.get_input_tensor(k_op)[0])
    +    if k_op.data_format == "channels_first":
    +        assert x.order == OrderNCHW
    +
    +    elif k_op.data_format == "channels_last":
    +        assert x.order == OrderNHWC
    +
    +    else:
    +        raise ValueError(f"[KerasConverter] Unknown data format is detected: {k_op.data_format}")
    +
    +    w = converter.convert_to_constant_variable(k_op.kernel, OrderHWCN)
    +
    +    ksize = tuple(k_op.kernel_size)
    +    stride = tuple(k_op.strides)
    +    if k_op.padding == "valid":
    +        padding = (0, 0)
    +
    +    elif k_op.padding == "same":
    +        padding = (ksize[0] // 2, ksize[1] // 2)
    +
    +    else:
    +        raise ValueError(f"[KerasConverter] Unknown padding: {k_op.padding}")
    +
    +    y, = Deconvolution2D(None, ksize=ksize, stride=stride, padding=padding)(x, w)
    +    if k_op.use_bias:
    +        b = converter.convert_to_constant_variable(k_op.bias, OrderC)
    +        y = y + b
    +
    +    y = do_activation(k_op.activation, y)
    +    converter.set_variable(converter.get_output_tensor(k_op)[0], y)
     
     
     # noinspection PyUnusedLocal
    
    opened by knok 13
  • Cannot convert keras model: softmax() got an unexpected keyword argument 'axis'

    Cannot convert keras model: softmax() got an unexpected keyword argument 'axis'

    I am following the instructions to convert a resnet50 model word for word, and I get this:

    D:\repos\webdnn\resnet>python ..\bin\convert_keras.py resnet50.h5 --input_shape "(1,224,224,3)" --out output --backend webgl
    C:\Users\jeff\AppData\Local\Continuum\anaconda3\lib\site-packages\h5py\__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
      from ._conv import register_converters as _register_converters
    Using TensorFlow backend.
    [convert_keras.py] Generating feedforward graph
    2018-09-22 07:53:38.725911: I C:\tf_jenkins\home\workspace\rel-win\M\windows\PY\36\tensorflow\core\platform\cpu_feature_guard.cc:137] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX AVX2
    Traceback (most recent call last):
      File "..\bin\convert_keras.py", line 114, in <module>
        main()
      File "..\bin\convert_keras.py", line 61, in main
        model = keras.models.load_model(args.kerasmodel, custom_objects=custom_objects, compile=False)
      File "C:\Users\jeff\AppData\Local\Continuum\anaconda3\lib\site-packages\keras\engine\saving.py", line 260, in load_model
        model = model_from_config(model_config, custom_objects=custom_objects)
      File "C:\Users\jeff\AppData\Local\Continuum\anaconda3\lib\site-packages\keras\engine\saving.py", line 334, in model_from_config
        return deserialize(config, custom_objects=custom_objects)
      File "C:\Users\jeff\AppData\Local\Continuum\anaconda3\lib\site-packages\keras\layers\__init__.py", line 55, in deserialize
        printable_module_name='layer')
      File "C:\Users\jeff\AppData\Local\Continuum\anaconda3\lib\site-packages\keras\utils\generic_utils.py", line 145, in deserialize_keras_object
        list(custom_objects.items())))
      File "C:\Users\jeff\AppData\Local\Continuum\anaconda3\lib\site-packages\keras\engine\network.py", line 1027, in from_config
        process_node(layer, node_data)
      File "C:\Users\jeff\AppData\Local\Continuum\anaconda3\lib\site-packages\keras\engine\network.py", line 986, in process_node
        layer(unpack_singleton(input_tensors), **kwargs)
      File "C:\Users\jeff\AppData\Local\Continuum\anaconda3\lib\site-packages\keras\engine\base_layer.py", line 457, in __call__
        output = self.call(inputs, **kwargs)
      File "C:\Users\jeff\AppData\Local\Continuum\anaconda3\lib\site-packages\keras\layers\core.py", line 878, in call
        output = self.activation(output)
      File "C:\Users\jeff\AppData\Local\Continuum\anaconda3\lib\site-packages\keras\activations.py", line 29, in softmax
        return K.softmax(x)
      File "C:\Users\jeff\AppData\Local\Continuum\anaconda3\lib\site-packages\keras\backend\tensorflow_backend.py", line 3154, in softmax
        return tf.nn.softmax(x, axis=axis)
    TypeError: softmax() got an unexpected keyword argument 'axis'
    

    The model was saved using these instructions:

    (base) D:\repos\webdnn\resnet>python
    Python 3.6.5 |Anaconda, Inc.| (default, Mar 29 2018, 13:32:41) [MSC v.1900 64 bit (AMD64)] on win32
    Type "help", "copyright", "credits" or "license" for more information.
    >>> from keras.applications import resnet50
    C:\Users\jeff\AppData\Local\Continuum\anaconda3\lib\site-packages\h5py\__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
      from ._conv import register_converters as _register_converters
    Using TensorFlow backend.
    >>> model = resnet50.ResNet50(include_top=True, weights='imagenet')
    2018-09-22 07:27:17.054697: I T:\src\github\tensorflow\tensorflow\core\platform\cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2
    Downloading data from https://github.com/fchollet/deep-learning-models/releases/download/v0.2/resnet50_weights_tf_dim_ordering_tf_kernels.h5
    102858752/102853048 [==============================] - 7s 0us/step
    >>> model.save("resnet50.h5")
    >>> exit()
    
    opened by jeffsaremi 10
  • [Question] Can I check how many inputs I can feed efficiently?

    [Question] Can I check how many inputs I can feed efficiently?

    Hi.

    In my understanding, a WebDNN model can be fed several inputs at once if the GPU has enough capacity. Is that right?

    And if so, I would like to know how many inputs I can feed efficiently on the target GPU. Is that possible?

    Thanks.

    opened by y-ich 10
  • TypeError: Placeholder#value must be a int, not '<class 'webdnn.graph.placeholder.Placeholder'>'

    TypeError: Placeholder#value must be a int, not '<class 'webdnn.graph.placeholder.Placeholder'>'

    Using TensorFlow backend.
    Here To import all
    [convert_keras.py] Generating feedforward graph
    2017-11-28 22:22:25.563469: W C:\tf_jenkins\home\workspace\rel-win\M\windows\PY\36\tensorflow\core\platform\cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
    2017-11-28 22:22:25.563555: W C:\tf_jenkins\home\workspace\rel-win\M\windows\PY\36\tensorflow\core\platform\cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.
    Traceback (most recent call last):
      File "./webdnn/bin/convert_keras.py", line 117, in <module>
        main()
      File "./webdnn/bin/convert_keras.py", line 67, in main
        graph = converter.convert(model)
      File "E:\CI-Cor-Ready\ai\face-demo-work\webdnn\webdnn\src\graph_transpiler\webdnn\frontend\keras\converter.py", line 105, in convert
        return self._convert_fallback(model)
      File "E:\CI-Cor-Ready\ai\face-demo-work\webdnn\webdnn\src\graph_transpiler\webdnn\frontend\keras\converter.py", line 115, in _convert_fallback
        v.shape[0].value = self._batch_size
      File "E:\CI-Cor-Ready\ai\face-demo-work\webdnn\webdnn\src\graph_transpiler\webdnn\graph\placeholder.py", line 443, in value
        raise TypeError(f"Placeholder#value must be a int, not '{type(new_v)}'")
    TypeError: Placeholder#value must be a int, not '<class 'webdnn.graph.placeholder.Placeholder'>'
    
    opened by lygstate 8
  • Error generating output graph with keras Resnet Example

    Error generating output graph with keras Resnet Example

    When I run the command to generate the graph from the ResNet example (python bin/convert_keras.py resnet50.h5 --input_shape '(1,224,224,3)' --out output), I get the following error:

    (screenshot of the error message omitted)

    Can you help me resolve this error?

    opened by nikhilaravi 7
  • can't convert a simple LeNet from Pytorch even after removing pooling layers.

    can't convert a simple LeNet from Pytorch even after removing pooling layers.

    I have a very simple, very standard LeNet in PyTorch and I can't convert it to WebDNN. The pooling layers give an error about the dilation attribute. Even after removing them and leaving just the convolutions, I get a similar error, but this time about the transA attribute.

    File "", line 1, in runfile('/Users/alirezagoudarzi/github/Aya-s-Brain/WebDNN_Test/pytorch/train_model.py', wdir='/Users/alirezagoudarzi/github/Aya-s-Brain/WebDNN_Test/pytorch')

    File "/Users/alirezagoudarzi/anaconda/envs/py36/lib/python3.6/site-packages/spyder/utils/site/sitecustomize.py", line 705, in runfile execfile(filename, namespace)

    File "/Users/alirezagoudarzi/anaconda/envs/py36/lib/python3.6/site-packages/spyder/utils/site/sitecustomize.py", line 102, in execfile exec(compile(f.read(), filename, 'exec'), namespace)

    File "/Users/alirezagoudarzi/github/Aya-s-Brain/WebDNN_Test/pytorch/train_model.py", line 142, in graph = PyTorchConverter().convert(net, dummy_input)

    File "/Users/alirezagoudarzi/anaconda/envs/py36/lib/python3.6/site-packages/webdnn-1.2.3-py3.6.egg/webdnn/frontend/pytorch/converter.py", line 92, in convert graph = ONNXConverter().convert(onnx.load(proto_path))

    File "/Users/alirezagoudarzi/anaconda/envs/py36/lib/python3.6/site-packages/webdnn-1.2.3-py3.6.egg/webdnn/frontend/onnx/converter.py", line 94, in convert self._convert_operator(onnx_op)

    File "/Users/alirezagoudarzi/anaconda/envs/py36/lib/python3.6/site-packages/webdnn-1.2.3-py3.6.egg/webdnn/frontend/onnx/converter.py", line 114, in _convert_operator super(ONNXConverter, self)._convert_operator(proto)

    File "/Users/alirezagoudarzi/anaconda/envs/py36/lib/python3.6/site-packages/webdnn-1.2.3-py3.6.egg/webdnn/frontend/converter.py", line 117, in _convert_operator self._handler_map[self.class.name][operator_key](self, operator)

    File "/Users/alirezagoudarzi/anaconda/envs/py36/lib/python3.6/site-packages/webdnn-1.2.3-py3.6.egg/webdnn/frontend/onnx/defs/math.py", line 370, in _convert_gemm y, = Tensordot(None, axes=(A.order.axes[0 if attrs["transA"].i else 1], B.order.axes[1 if attrs["transB"].i else 0]))(A, B)

    KeyError: 'transA'

    opened by alirezag 6
  • "Use with Keras Model" in Tutorioal does not work properly

    I'd like to know how to solve the following problem. I've tried the tutorial "Use with Keras Model" both as a shell command and as a Python script, but both result in an error like:

    AttributeError: 'Model' object has no attribute 'nodes_by_depth'
    

    I really appreciate your help.

    bug 
    opened by tomoyukilabs 6
  • convert Lambda

    convert Lambda

    I'm trying to convert the yolo_v2 model to webdnn but ran into an error with a Lambda layer - it seems the '_convert_lambda' function has not yet been implemented:

    @KerasConverter.register_handler("Lambda")
    def _convert_lambda(converter: KerasConverter, k_op: "keras.layers.Lambda"):
        # TODO
        raise NotImplementedError('[KerasConverter] keras.layers.Lambda is not supported')
    

    Would you be able to implement this or suggest how we can implement it ourselves?

    We're working on a project to showcase NN with JS using WebDNN for NodeConf Argentina on October 26th, so we would greatly appreciate a speedy response!

    opened by nikhilaravi 6
  • How to interpret output values

    How to interpret output values

    First, thanks for the amazing toolkit! :)

    I have a two-class classifier on small grayscale images, trained with Keras and converted to run on WebGL with WebDNN. When I evaluate the model in Keras, it gives me the output [ 0.16141078 0.83858919]. But when I evaluate the model on the same data with WebDNN it gives me [-309.3228759765625, 169.71974182128906]. The values are ordered correctly (argmax still works), but the values are incorrect. I thought maybe they were pre-softmax values, but that doesn't seem correct.

    Next, I thought that maybe the problem was that I trained my network on images with pixels in the range 0-1, but maybe WebDNN expects pixels in the range 0-255. So I tried adding the scale: [255] option when setting the input for the network. Changing this gives me [-294116294656, 666073825280], and now the predictions are no longer correct.

    How can I interpret the output of the network, and convert these values to the same predictions I get from Keras?

    Thanks!
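
    One quick offline check of the "pre-softmax values" hypothesis is to push the reported numbers through a softmax and compare with the Keras probabilities. A minimal numpy sketch, using the values quoted above:

    import numpy as np

    def softmax(v):
        # Subtract the max for numerical stability before exponentiating.
        e = np.exp(v - np.max(v))
        return e / e.sum()

    webdnn_out = np.array([-309.3228759765625, 169.71974182128906])
    keras_out = np.array([0.16141078, 0.83858919])

    # If the WebDNN values were plain pre-softmax logits, softmax(webdnn_out)
    # would be close to keras_out; here it is essentially [0, 1], which suggests
    # the discrepancy lies elsewhere (e.g. in the backend computation itself).
    print(softmax(webdnn_out))
    print(keras_out)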

    opened by kylemcdonald 6
  • Bump json5 from 2.2.0 to 2.2.3

    Bump json5 from 2.2.0 to 2.2.3

    Bumps json5 from 2.2.0 to 2.2.3.

    Release notes

    Sourced from json5's releases.

    v2.2.3

    v2.2.2

    • Fix: Properties with the name __proto__ are added to objects and arrays. (#199) This also fixes a prototype pollution vulnerability reported by Jonathan Gregson! (#295).

    v2.2.1

    • Fix: Removed dependence on minimist to patch CVE-2021-44906. (#266)
    Changelog

    Sourced from json5's changelog.

    v2.2.3 [code, diff]

    v2.2.2 [code, diff]

    • Fix: Properties with the name __proto__ are added to objects and arrays. (#199) This also fixes a prototype pollution vulnerability reported by Jonathan Gregson! (#295).

    v2.2.1 [code, diff]

    • Fix: Removed dependence on minimist to patch CVE-2021-44906. (#266)
    Commits
    • c3a7524 2.2.3
    • 94fd06d docs: update CHANGELOG for v2.2.3
    • 3b8cebf docs(security): use GitHub security advisories
    • f0fd9e1 docs: publish a security policy
    • 6a91a05 docs(template): bug -> bug report
    • 14f8cb1 2.2.2
    • 10cc7ca docs: update CHANGELOG for v2.2.2
    • 7774c10 fix: add proto to objects and arrays
    • edde30a Readme: slight tweak to intro
    • 97286f8 Improve example in readme
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies javascript 
    opened by dependabot[bot] 0
  • Bump loader-utils from 2.0.0 to 2.0.4

    Bump loader-utils from 2.0.0 to 2.0.4

    Bumps loader-utils from 2.0.0 to 2.0.4.

    Release notes

    Sourced from loader-utils's releases.

    v2.0.4

    2.0.4 (2022-11-11)

    Bug Fixes

    v2.0.3

    2.0.3 (2022-10-20)

    Bug Fixes

    • security: prototype pollution exploit (#217) (a93cf6f)

    v2.0.2

    2.0.2 (2021-11-04)

    Bug Fixes

    • base64 generation and unicode characters (#197) (8c2d24e)

    v2.0.1

    2.0.1 (2021-10-29)

    Bug Fixes

    Changelog

    Sourced from loader-utils's changelog.

    2.0.4 (2022-11-11)

    Bug Fixes

    2.0.3 (2022-10-20)

    Bug Fixes

    • security: prototype pollution exploit (#217) (a93cf6f)

    2.0.2 (2021-11-04)

    Bug Fixes

    • base64 generation and unicode characters (#197) (8c2d24e)

    2.0.1 (2021-10-29)

    Bug Fixes

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.



    dependencies javascript 
    opened by dependabot[bot] 0
  • Bump minimatch from 3.0.4 to 3.1.2

    Bump minimatch from 3.0.4 to 3.1.2

    Bumps minimatch from 3.0.4 to 3.1.2.

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.



    dependencies javascript 
    opened by dependabot[bot] 0
  • Bump terser from 5.7.0 to 5.14.2

    Bump terser from 5.7.0 to 5.14.2

    Bumps terser from 5.7.0 to 5.14.2.

    Changelog

    Sourced from terser's changelog.

    v5.14.2

    • Security fix for RegExps that should not be evaluated (regexp DDOS)
    • Source maps improvements (#1211)
    • Performance improvements in long property access evaluation (#1213)

    v5.14.1

    • keep_numbers option added to TypeScript defs (#1208)
    • Fixed parsing of nested template strings (#1204)

    v5.14.0

    • Switched to @​jridgewell/source-map for sourcemap generation (#1190, #1181)
    • Fixed source maps with non-terminated segments (#1106)
    • Enabled typescript types to be imported from the package (#1194)
    • Extra DOM props have been added (#1191)
    • Delete the AST while generating code, as a means to save RAM

    v5.13.1

    • Removed self-assignments (varname=varname) (closes #1081)
    • Separated inlining code (for inlining things into references, or removing IIFEs)
    • Allow multiple identifiers with the same name in var destructuring (eg var { a, a } = x) (#1176)

    v5.13.0

    • All calls to eval() were removed (#1171, #1184)
    • source-map was updated to 0.8.0-beta.0 (#1164)
    • NavigatorUAData was added to domprops to avoid property mangling (#1166)

    v5.12.1

    • Fixed an issue with function definitions inside blocks (#1155)
    • Fixed parens of new in some situations (closes #1159)

    v5.12.0

    • TERSER_DEBUG_DIR environment variable
    • @​copyright comments are now preserved with the comments="some" option (#1153)

    v5.11.0

    • Unicode code point escapes (\u{abcde}) are not emitted inside RegExp literals anymore (#1147)
    • acorn is now a regular dependency

    v5.10.0

    • Massive optimization to max_line_len (#1109)
    • Basic support for import assertions
    • Marked ES2022 Object.hasOwn as a pure function
    • Fix delete optional?.property
    • New CI/CD pipeline with github actions (#1057)

    ... (truncated)

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.



    dependencies javascript 
    opened by dependabot[bot] 0
  • Bump protobufjs from 6.11.2 to 6.11.3

    Bump protobufjs from 6.11.2 to 6.11.3

    Bumps protobufjs from 6.11.2 to 6.11.3.

    Release notes

    Sourced from protobufjs's releases.

    v6.11.3

    6.11.3 (2022-05-20)

    Bug Fixes

    Changelog

    Sourced from protobufjs's changelog.

    6.11.3 (2022-05-20)

    Bug Fixes

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.



    dependencies javascript 
    opened by dependabot[bot] 0
  • Bump minimist from 1.2.5 to 1.2.6

    Bump minimist from 1.2.5 to 1.2.6

    Bumps minimist from 1.2.5 to 1.2.6.

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.



    dependencies javascript 
    opened by dependabot[bot] 0
Releases(v1.2.11)
  • v1.2.11(Jan 17, 2020)

  • v1.2.10(Jun 28, 2019)

    Features:

    • implement sqrt operator (#920)
    • implement Normalize (#921)

    Bugfix:

    • Support Module.buffer removal in emscripten (v1.38.30) (#923)

    Please note that v1.2.9 had a build issue, so it was skipped.

    Source code(tar.gz)
    Source code(zip)
  • v1.2.8(Apr 3, 2019)

  • v1.2.7(Nov 19, 2018)

    This update includes the following improvements:

    • Support latest pytorch's alexnet (#894), resnet (#900)
    • Workaround for stack overflow for large model (#900)
    Source code(tar.gz)
    Source code(zip)
  • v1.2.6(Aug 22, 2018)

    This update includes the following bug fixes:

    • Fix getImageArrayFromDrawable (#850) thanks to @marcorighini
    • Fix unroll_concat when the number of input is odd (#853)
    • Fix IE11 support (#856,#874)
    • Fix issue with multiple networks due to global transformUrlDelegate (#858) thanks to @alessandrolenzi
    • Support emscripten v1.38.1 (#880)
    Source code(tar.gz)
    Source code(zip)
  • v1.2.5(May 24, 2018)

    Supported Chainer v4 (#832).

    This update includes the following bug fixes:

    • ImageData handling in getImageArray (#838)
    • Apply transformUrlDelegate to wasm file (#842)
    Source code(tar.gz)
    Source code(zip)
  • v1.2.4(May 6, 2018)

  • v1.1.0(Jul 1, 2017)

    Highlights

    Support Dynamic Computation Graph

    Dynamic computation graphs such as RNNs are now supported, and some interfaces in the descriptor runner have changed.

    //v1.0.0:
    let runner = await WebDNN.prepareAll(modelPath);
    let x = await runner.getInputViews()[0];
    let y = await runner.getOutputViews()[0];
    
    x.set(loadImageData());
    
    await runner.run()
    
    console.log('result:', y);
    
    //v1.1.0:
    
    // "WebDNN.prepareAll()" is removed. Please use "WebDNN.load"
    let runner = await WebDNN.load(modelPath);
    
    // You can get input and output views synchronously
    let x = runner.getInputViews()[0];
    let y = runner.getOutputViews()[0];
    
    // You can modify dynamic hyper parameter at run-time (ex: length of input time series)
    runner.setPlaceholderValue({ T: 8 });
    
    x.set(loadTextData());
    
    await runner.run();
    
    //Because "runner.getInputViews()" and "runner.getOutputViews()" return "SymbolicArrayBufferView",
    //To get actual ArrayBufferView, you need to convert them explicitly.
    print('result:', y.toActual());
    

    (#305, #306, #307, #320, #338)

    Update Documents

    The reference documentation has been updated (#342).

    Support WebGPU Backend in iOS

    From iOS 11, WebGPU is also supported in mobile Safari (behind an experimental flag). WebDNN supports this environment. (#330)

    Add Operators

    Fix Many Bugs

    An enormous number of bugs have been fixed!

    Source code(tar.gz)
    Source code(zip)
  • v1.0.0(May 28, 2017)

Owner
Machine Intelligence Laboratory (The University of Tokyo)