Dev here. JavaScript engines (especially V8 in Chromium) have a memory limit (as per performance.memory.jsHeapSizeLimit), at best around 4 GB. localStorage and sessionStorage (the JS features that would be used to store the neural network weights and training data) have even lower limits. While I fear that locally AI-driven advertising could happen in the near future, it’s not currently technically feasible in current Chromium (Chrome, Vivaldi, Edge, Opera, etc.) and Gecko (Firefox) implementations.
Then Alphabet will come up with a new bullshit idea, “remove the limits for ‘trusted’ advertisers”, so that they can inject more code than allowed as long as they keep paying for their ad “partnership”
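Going back to the dev’s point about limits: here’s a minimal sketch you can paste into the console to see those ceilings for yourself. Note that performance.memory is non-standard and Chromium-only (hence the guard), and navigator.storage.estimate() reports the overall origin quota (IndexedDB and friends), not the much smaller localStorage cap, which is typically around 5 MB.

```js
// Minimal sketch: probing the limits mentioned above.
// performance.memory is non-standard and Chromium-only, so feature-check it first.
if (performance.memory) {
  const limitGiB = performance.memory.jsHeapSizeLimit / 2 ** 30;
  console.log(`JS heap size limit: ~${limitGiB.toFixed(2)} GiB`);
} else {
  console.log('performance.memory is not exposed in this browser');
}

// Overall origin storage quota (covers IndexedDB etc.); localStorage itself is
// far too small for model weights or training data.
navigator.storage.estimate().then(({ usage, quota }) => {
  console.log(`Origin storage: ~${(usage / 2 ** 20).toFixed(0)} MiB used of ~${(quota / 2 ** 20).toFixed(0)} MiB quota`);
});
```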
That’s why we need to fight against the Chromium monopoly
It became difficult as Web technologies have grown more complex: native CPU instructions through WASM, Bluetooth through Web Bluetooth, 3D graphics through WebGL, NFC, motion sensors, serial ports, and so on. Nowadays it’s simply too hard to maintain a browser engine, which is why many of the former alternatives were abandoned and are now deprecated.
I dare anyone to even just compile a document containing all the standards you’d need to implement
Actually, there is a compilation of all the standards specifications. It’s at the W3C (World Wide Web Consortium), where all the technical details are documented in depth as “Technical Reports”, available at https://www.w3.org/TR/ . To date, there are 309 published Technical Reports with “Standard” status.
Fun fact: while looking for the link to post here, I came across a Candidate Recommendation entitled “Web Neural Network API”, published just yesterday. Seems like they intend to bring browser-native neural network capabilities into the Web specifications, and the “near future” I mentioned is even closer… 🤔
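For a flavor of what “browser-native neural network capabilities” would even look like, here’s a speculative sketch in the spirit of that candidate spec. Heavy caveat: the API isn’t broadly shipped, and details like the operand descriptor fields and the execution call have changed between drafts, so treat this as illustrative only.

```js
// Speculative sketch of the Web Neural Network API (WebNN) per the candidate
// spec draft; names have shifted between drafts, so this is illustrative only.
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);

// y = relu(x * W + b) for a tiny 2x2 layer
const desc = { dataType: 'float32', dimensions: [2, 2] }; // some drafts call this "shape"
const x = builder.input('x', desc);
const W = builder.constant(desc, new Float32Array([1, 0, 0, 1]));
const b = builder.constant({ dataType: 'float32', dimensions: [1, 2] }, new Float32Array([0.5, -0.5]));
const y = builder.relu(builder.add(builder.matmul(x, W), b));

const graph = await builder.build({ y });
// Older drafts execute with context.compute(graph, inputs, outputs);
// newer drafts move execution to dispatch() with MLTensor objects.
```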
This is purely a nitpick, but WASM lets you run WASM instructions, not native CPU instructions. It does let you get much closer to the speed of running native code, though.
Exactly… including themselves, as they are a major player in the advertising market (Google AdSense).
I really hope you don’t know about this 4GB limit specifically because you’ve run up against it while doing anything real-world.
Not yet, but I often code up experiments involving datasets (I like to experiment with natural language processing, randomness, programmatic art and demoscenes; the list goes on).
Canvas code can get out of hand very quickly if not done right
I’ve made exactly two projects that utilized canvas, both of which I “released” in a sense. One contains 248 KB of JS code and the other contains 246 KB. That’s before minification.
So I guess that means I did my canvas code right. Lol.
(Unless you meant 3D canvas or WebGL stuff, which I haven’t played with.)
I think they’re referring to the memory footprint, not the source code file size.
Code size isn’t really related to how much graphics data you’re throwing into RAM.
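Right, and the gap can be dramatic: a handful of lines of perfectly ordinary canvas code can pin tens of megabytes of pixel data. A minimal sketch (the 4096x4096 size is just an arbitrary example):

```js
// A handful of lines, but roughly 64 MiB of pixel data:
// a 4096x4096 RGBA canvas buffer is 4096 * 4096 * 4 bytes.
const canvas = document.createElement('canvas');
canvas.width = 4096;
canvas.height = 4096;
const ctx = canvas.getContext('2d');

ctx.fillStyle = 'hotpink';
ctx.fillRect(0, 0, canvas.width, canvas.height);

// Reading the pixels back adds a second ~64 MiB copy on top of the canvas backing store.
const pixels = ctx.getImageData(0, 0, canvas.width, canvas.height);
console.log(`${(pixels.data.byteLength / 2 ** 20).toFixed(0)} MiB of pixel data in one ImageData`);
```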
It would just slowly accumulate it over time, a little bit here, a little bit there, until it has a whole fleet of stuff queued up to serve you. So while you’re generating more and more bits for more videos, it’s serving you videos, and sharing those bits over the WebSockets that JS CDNs force-feed our browsers, back to centralized servers, which offload the work to similar users with similar ad tastes so they can help compile it too.
Some shit like that. Adtech is cyber terrorism. Never forget.