_______               __                   _______
       |   |   |.---.-..----.|  |--..-----..----. |    |  |.-----..--.--.--..-----.
       |       ||  _  ||  __||    < |  -__||   _| |       ||  -__||  |  |  ||__ --|
       |___|___||___._||____||__|__||_____||__|   |__|____||_____||________||_____|
                                                              on Gopher (unofficial)
   URI Visit Hacker News on the Web
       
       
       COMMENT PAGE FOR:
   URI   Show HN: Jax-JS, array library in JavaScript targeting WebGPU
       
       
        forgotpwd16 wrote 6 hours 34 min ago:
         Very nice work. I like how it supports WebGPU but also CPU/Wasm/WebGL.
         Would love to read more in the README on the internals & the design
         choices made, e.g. ref counting.
         
         P.S. Thanks for taking the time to work on this and releasing
         something polished rather than the Claude slop made within a few
         days that seems to be the norm now.
       
        fouronnes3 wrote 16 hours 40 min ago:
         Congrats on the launch! This is a very exciting project, because the
         only decent autodiff implementation in TypeScript was TensorFlow.js,
         which has been completely abandoned by Google. Everyone uses ONNX
         Runtime Web for inference, but actually computing gradients in
         TypeScript has been surprisingly absent from the ecosystem since tfjs
         died.
        
         I will be following this project closely! Best of luck, Eric! Do you
         have plans to keep working on it for some time? Is it a side project,
         or will you be able to commit to jax-js longer term?
       
          ekzhang wrote 16 hours 27 min ago:
          Yes, we are actively working on it! The goal is to be a full ML
          research library, not just a model inference runtime. You can join
           the Discord to follow along.
       
        maelito wrote 17 hours 20 min ago:
         Could not run the demos on Firefox. On Chromium, the Great
         Expectations demo loads, but then nothing happens.
       
          ekzhang wrote 16 hours 55 min ago:
           Firefox doesn’t support WebGPU yet [2], but you can run programs in
           the REPL [1] through other backends like Wasm/WebGL:
          
   URI    [1]: https://jax-js.com/repl
   URI    [2]: https://caniuse.com/webgpu
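As a rough sketch of that fallback logic: feature-detecting WebGPU via the standard `navigator.gpu` entry point before picking a backend might look like the following. The `"webgpu"`/`"wasm"` backend names are illustrative only, not jax-js's documented API:

```typescript
// Pick a compute backend based on what the browser exposes.
// Backend names here are illustrative, not the real jax-js API.
async function pickBackend(): Promise<"webgpu" | "wasm"> {
  // navigator.gpu is only defined in browsers that ship WebGPU.
  const gpu = (globalThis as any).navigator?.gpu;
  if (gpu) {
    // requestAdapter() can still resolve to null (e.g. blocklisted GPU).
    const adapter = await gpu.requestAdapter();
    if (adapter) return "webgpu";
  }
  return "wasm"; // portable fallback; works in Firefox today
}
```

In an environment without WebGPU (current Firefox, Node.js), this resolves to `"wasm"` rather than throwing.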
       
            forgotpwd16 wrote 6 hours 41 min ago:
             According to that page, WebGPU is supported in Firefox (behind
             the `dom.webgpu.enabled` flag) but only enabled by default on
             Windows & macOS (i.e. not Linux).
       
        bobajeff wrote 18 hours 15 min ago:
         This is really great. I don't do ML stuff, but I have some mathy
         things that would benefit from running on the GPU, so it's great to
         see the Web getting this.
        
        I hope this will help grow the js science community.
       
        sestep wrote 18 hours 45 min ago:
        Hey Eric, great to see you've now published this! I know we chatted
        about this briefly last year, but it would be awesome to see how the
        performance of jax-js compares against that of other autodiff tools on
        a broader and more standard set of benchmarks:
        
   URI  [1]: https://github.com/gradbench/gradbench
       
          ekzhang wrote 17 hours 45 min ago:
           For sure! It looks like this is benchmarking the autodiff CPU time,
           not the actual kernels though, which (correct me if I’m wrong)
           isn’t really relevant for an ML library — it’s more for when you
           have a really complex scientific expression.
       
            sestep wrote 16 hours 52 min ago:
            Nope, both are measured! In fact, the time to do the autodiff
            transformation isn't even reflected in the charts shown on the
            README and the website; those charts only show the time to actually
            run the computations.
       
              ekzhang wrote 16 hours 48 min ago:
              Hm okay, seems like an interesting set of benchmarks — let me
              know if there’s anything I can do to help make jax-js more
               compatible with your Docker setup.
       
                sestep wrote 16 hours 45 min ago:
                It should be fairly straightforward; feel free to open a PR
                following the instructions in CONTRIBUTING.md :)
       
                  ekzhang wrote 14 hours 42 min ago:
                   I don’t think this is straightforward, but it may be a
                   skill issue on my part. It would require dockerizing
                   headless Chrome with WebGPU support and dynamically
                   injecting custom bundled JavaScript into the page, then
                   extracting the results with Chrome IPC.
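The headless-Chrome half of that usually comes down to a handful of launch flags. A rough sketch for Linux follows; the exact flag set varies by Chrome version and GPU driver, so treat this as a starting point rather than a verified recipe:

```shell
# Launch new-style headless Chrome with WebGPU enabled (Linux, Vulkan).
# Flag names change across Chrome versions; verify against your build.
chrome --headless=new \
  --no-sandbox \
  --use-angle=vulkan \
  --enable-features=Vulkan \
  --enable-unsafe-webgpu \
  --remote-debugging-port=9222 \
  about:blank
```

The `--remote-debugging-port` flag exposes the DevTools protocol, which is one way to inject the bundled JavaScript and read back benchmark results.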
       
                    sestep wrote 12 hours 40 min ago:
                    Ahh no you're right, I forgot about the difficulties for
                    GPU specifically; apologies for my overly curt earlier
                    message. More accurately: I think this is definitely
                    possible (Troels and I have talked a bit about this
                    previously) and I'd be happy to work together if this is
                    something you're interested in. I probably won't work on
                    this if you're not interested on your end, though.
       
        yuppiemephisto wrote 18 hours 51 min ago:
         This project is an inspiration. I've been working on porting tinygrad
         to Lean: [1]
         
    URI  [1]: https://github.com/alok/tinygrad
       
        mlajtos wrote 18 hours 52 min ago:
         I have a project using tfjs, and jax-js is a very exciting
         alternative. However, during porting I struggled a lot with the
         `.ref` and `.dispose()` API. Coming from tfjs, where you garbage
         collect with `tf.tidy(() => { ... })`, the API in jax-js seems very
         low-level and error-prone. Is that something that can be improved, or
         is it inherent to how jax-js works?
         
         Would `using` [0] help here?
        
        [0]:
        
   URI  [1]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refere...
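To make the refcounting model being discussed concrete, here is a toy sketch. The names (`NDArray`, `ref`, `dispose`) mimic but are not the actual jax-js API:

```typescript
// Hypothetical sketch of reference-counted array lifetimes; NDArray,
// ref, and dispose are stand-ins, not the real jax-js interface.
class NDArray {
  private refs = 1;
  freed = false;

  // Each caller that wants to keep the array takes its own reference.
  ref(): this {
    this.refs += 1;
    return this;
  }

  // The backing buffer is released when the last reference is dropped.
  dispose(): void {
    this.refs -= 1;
    if (this.refs === 0) this.freed = true;
  }
}

const a = new NDArray();
const b = a.ref(); // two owners of the same buffer
a.dispose();       // still alive: b holds a reference
b.dispose();       // last owner gone, buffer is freed
```

A TC39 `using` declaration could in principle sit on top of such a scheme by exposing `dispose` as `[Symbol.dispose]`, so each scope drops its reference automatically; whether that composes with jax-js's transforms is exactly the question raised in this thread.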
       
          ekzhang wrote 17 hours 42 min ago:
           I don’t think tf.tidy() is a sound API under jvp/grad
           transformations; it also prevents you from using async, which makes
           it incompatible with GPU backends (or blocks the page), a pretty
           big issue. [1] Thanks for the feedback though, just explaining how
           we arrived at this API. I hope you’ll at least try it out —
           hopefully you will see when developing that the refs are more
           flexible than the alternatives.
          
   URI    [1]: https://github.com/tensorflow/tfjs/issues/5468
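The async incompatibility can be demonstrated with a toy tidy-style scope; `Buf`, `tidy`, and `alloc` here are illustrative stand-ins, not tfjs or jax-js code:

```typescript
class Buf {
  disposed = false;
  dispose() { this.disposed = true; }
}

// A tf.tidy-style scope: tracks allocations, frees them when fn returns.
function tidy<T>(fn: (alloc: () => Buf) => T): T {
  const live: Buf[] = [];
  const result = fn(() => {
    const b = new Buf();
    live.push(b);
    return b;
  });
  live.forEach((b) => b.dispose()); // runs as soon as fn returns
  return result;
}

// With an async callback, tidy() receives a pending Promise immediately
// and frees the buffers; the awaited code then sees a disposed buffer.
async function demo(): Promise<boolean> {
  let usedAfterDispose = false;
  await tidy(async (alloc) => {
    const b = alloc();
    await Promise.resolve();       // e.g. waiting on a GPU readback
    usedAfterDispose = b.disposed; // true: freed before we used it
  });
  return usedAfterDispose;
}
```

Because the async callback suspends at the first `await`, the synchronous `tidy()` has already freed everything by the time the continuation runs; this is roughly the use-after-free hazard described in the linked tfjs issue.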
       
            mlajtos wrote 7 hours 27 min ago:
            I'll grind jax-js more and see if refs become invisible then.
            Thanks for a great project!
       
        esafak wrote 19 hours 4 min ago:
         What is the state of web ML? Anybody doing cool things already? How
         about [1]?
        
   URI  [1]: https://www.w3.org/TR/webnn/
       
          sroussey wrote 17 hours 43 min ago:
           ONNX on the web has the most models available and can use WebGPU,
           which is available almost everywhere.
           
           Hugging Face’s transformers.js uses it, and I use that for [1]
           (also TensorFlow MediaPipe, though that is using Wasm).
           
           I don’t think WebNN has gone anywhere, and it's too restrictive.
          
   URI    [1]: https://workglow.dev
       
            ekzhang wrote 17 hours 40 min ago:
             Since ONNX is just a model data format, you can actually parse
             and run ONNX files in jax-js as well. Here’s an example of
             running DETR ResNet-50 from Xenova’s transformers.js checkpoint
             in jax-js: [1] I don’t intend to support everything in ONNX right
             now, especially quant/dequant, but eventually it would be
             interesting to see if we can help accelerate transformers.js with
             a jax-js backend, plus goodies like kernel fusion.
             
             jax-js is trying to explore being an ML research library, whereas
             ONNX is a runtime for exported models.
            
   URI      [1]: https://jax-js.com/detr-resnet-50
       
       
   DIR <- back to front page