I don’t think it’s ADD. There’s a book called ‘Thinking, Fast and Slow’. In it, the psychologist separates mental functions into two systems: System 1 is intuitive, effortless, and fast; System 2 is effortful and slow, but precise. What happens here is simply that people are trying to be efficient with their thinking, so they use less of System 2, which is what reading requires.
This won’t protect your .env files though, right?
Right, but my machine is safe at least.
It’s possible. For the pnpm package cache you need to attach another volume, and another one for globally installed packages.
Keep your secrets:
`alias npm="docker run -it --rm -v $(pwd):/app -w /app node:latest npm"`

Not enough, but better than nothing.
fxdave@lemmy.ml to No Stupid Questions@lemmy.world • How could AI be better than an encyclopedia?
1 · 2 months ago

The concept of understanding implies some form of meta-knowledge about the subject.
That can be solved if you teach it the meta-knowledge with intermediary steps, for example:
prompt: 34*3=
step1: 4*3 + 30*3 =
step2: 12 + 10*3*3 =
step3: 12 + 10*9 =
step4: 12 + 90 =
step5: 100 + 2 =
step6: 102
result: 102

Such training data is hard to find, though. But Claude, for example, already uses intermediary steps. It preprocesses your input multiple times: it writes code, runs that code to process your input, and even that is still not the final response. Unfortunately, it’s already smarter than some junior developers, and the consequences of that are worrying.
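As an illustration (a sketch of mine, not something from the thread), such step-by-step data could be generated programmatically. The `decomposedMultiplication` helper below is hypothetical and only handles a two-digit first factor, decomposed the same way as the example above.

```typescript
// Hypothetical generator for step-by-step multiplication training data.
// It splits the first factor into ones and tens, mirroring the example above.
function decomposedMultiplication(a: number, b: number): string {
  const ones = a % 10;   // e.g. 4 for 34
  const tens = a - ones; // e.g. 30 for 34
  return [
    `prompt: ${a}*${b}=`,
    `step1: ${ones}*${b} + ${tens}*${b} =`,
    `step2: ${ones * b} + ${tens * b} =`,
    `result: ${ones * b + tens * b}`,
  ].join("\n");
}

console.log(decomposedMultiplication(34, 3));
// prompt: 34*3=
// step1: 4*3 + 30*3 =
// step2: 12 + 90 =
// result: 102
```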
fxdave@lemmy.ml to No Stupid Questions@lemmy.world • How could AI be better than an encyclopedia?
2 · 2 months ago

But LLMs are not simply probabilistic machines; they are neural nets. Granted, they haven’t seen the world, and they didn’t learn the way we learn. What they mean by a caterpillar is just a vector. For humans, it’s a 3D, colorful, soft object with certain traits.
You can’t expect a being that sees and produces characters to know what we mean by a caterpillar; its job is to figure out the next character. But you could expect it to understand some grammar rules, even though we can’t expect it to explain the grammar.
As another example, I wrote a simple neural net, and with 6 neurons it could learn XOR. I think we can say that it understands XOR, can’t we? Or would you then say that an XOR gate understands XOR even better? I would not use the word “understand” for something that cannot learn, but why wouldn’t we use it for a neural net?
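For context, a minimal sketch (not the actual net from the comment) of a tiny network learning XOR: two hidden neurons and one output neuron with sigmoid activations, trained by plain stochastic gradient descent.

```typescript
// 2-2-1 feed-forward net (2 hidden + 1 output neuron, plus biases) trained on XOR.
const sigmoid = (x: number) => 1 / (1 + Math.exp(-x));
const rand = () => Math.random() * 2 - 1;

// Hidden layer: 2 neurons x 2 inputs; output layer: 2 weights + 1 bias.
const w1 = [[rand(), rand()], [rand(), rand()]];
const b1 = [rand(), rand()];
const w2 = [rand(), rand()];
let b2 = rand();
const lr = 0.5;

const xorData: Array<[number, number, number]> = [
  [0, 0, 0], [0, 1, 1], [1, 0, 1], [1, 1, 0],
];

function forward(x0: number, x1: number) {
  const h = [
    sigmoid(w1[0][0] * x0 + w1[0][1] * x1 + b1[0]),
    sigmoid(w1[1][0] * x0 + w1[1][1] * x1 + b1[1]),
  ];
  const out = sigmoid(w2[0] * h[0] + w2[1] * h[1] + b2);
  return { h, out };
}

for (let epoch = 0; epoch < 20000; epoch++) {
  for (const [x0, x1, target] of xorData) {
    const { h, out } = forward(x0, x1);
    // Backpropagate the squared error through the output and hidden layers.
    const dOut = (out - target) * out * (1 - out);
    const dH = [0, 1].map((i) => dOut * w2[i] * h[i] * (1 - h[i]));
    for (const i of [0, 1]) {
      w2[i] -= lr * dOut * h[i];
      w1[i][0] -= lr * dH[i] * x0;
      w1[i][1] -= lr * dH[i] * x1;
      b1[i] -= lr * dH[i];
    }
    b2 -= lr * dOut;
  }
}

// After training, the outputs approximate the XOR truth table
// (re-run if an unlucky initialization gets stuck in a local minimum).
for (const [x0, x1] of xorData) {
  console.log(x0, x1, forward(x0, x1).out.toFixed(2));
}
```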
fxdave@lemmy.ml to No Stupid Questions@lemmy.world • How could AI be better than an encyclopedia?
11 · 2 months ago

Any explanation? If they can write text, I assume they understand grammar. They are definitely skilled in some way. If you snowboard, do you understand snowboarding? The word “understand” can be misleading; that’s why I’m asking what understanding is.
fxdave@lemmy.ml to No Stupid Questions@lemmy.world • How could AI be better than an encyclopedia?
18 · 2 months ago

What is understanding? Isn’t understanding just a consequence of neurons communicating with each other? In that case, LLMs with deep learning can understand things.
fxdave@lemmy.ml to Mildly Infuriating@lemmy.world • The sheer amount of websites that are completely unusable without JavaScript (English)
1 · 2 months ago

What I meant is that you cannot turn every existing webpage into a basic page with simple tricks like disabling JS. That would be a never-ending fight.
You are the one adding extra complexity
I’m not the one defining the business requirements. I could build a site with true progressive enhancement; it’s just extra work, because the requirement is a modern page with actions, modals, notifications, etc.

There are two ways I can fulfill this: SSR with scripts that feel like hacks, or CSR. I choose CSR, but then progressive enhancement becomes extra work.
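As a rough sketch of what that extra work looks like (the `/subscribe` endpoint, the form markup, and the element IDs are made up): the form still submits normally without JS, and a small script enhances it into an in-place update.

```typescript
// Hypothetical markup: <form id="subscribe" method="post" action="/subscribe"> … </form>
// Without JS the form posts and the server renders a full page (the SSR path).
// With JS we intercept the submit and only update a status element in place.
const form = document.getElementById("subscribe");

if (form instanceof HTMLFormElement) {
  form.addEventListener("submit", async (event) => {
    event.preventDefault(); // skip the full page reload
    const response = await fetch(form.action, {
      method: "POST",
      body: new FormData(form),
    });
    const status = document.getElementById("subscribe-status");
    if (status) {
      status.textContent = response.ok ? "Subscribed!" : "Something went wrong.";
    }
  });
}
```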
fxdave@lemmy.ml to Mildly Infuriating@lemmy.world • The sheer amount of websites that are completely unusable without JavaScript (English)
1 · 2 months ago

Why is it “impossible to do them reliably” - without js presumably?

What I meant is that you cannot turn every existing webpage into a basic page with simple tricks like disabling JS. That would be a never-ending fight.
fxdave@lemmy.ml to Mildly Infuriating@lemmy.world • The sheer amount of websites that are completely unusable without JavaScript (English)
1 · 2 months ago

It suggests using minimal JS. I use React the same way: whatever I can do with CSS, I do with CSS. But I’m not going to footgun myself; I start the app with React because at some point I will need React.
fxdave@lemmy.ml to Mildly Infuriating@lemmy.world • The sheer amount of websites that are completely unusable without JavaScript (English)
1 · 2 months ago

It seems you misunderstood me.
There were horrible tricks and hacks that added not only UX improvements but useful content; we used jQuery for many of those things. That’s why I brought it up, and for the legacy vibe.
Disabling JS would have broken that site as well, reinforcing my point that disabling JS was never a reliable solution.
fxdave@lemmy.ml to Mildly Infuriating@lemmy.world • The sheer amount of websites that are completely unusable without JavaScript (English)
1 · 2 months ago

I was developing with Laravel until 2019, and I agree that you can write clean code with it. Still, there are many better options nowadays. I switched to Node.js because I can use TypeScript for both the backend and the frontend, and I’m happier with it. Although JS is not a great language, TypeScript is almost perfect. And it’s not only me who switched; people are ditching PHP because there are better options.
fxdave@lemmy.ml to Mildly Infuriating@lemmy.world • The sheer amount of websites that are completely unusable without JavaScript (English)
21 · 3 months ago

The only non-heated comment. I appreciate it. I will read it.
fxdave@lemmy.ml to Mildly Infuriating@lemmy.world • The sheer amount of websites that are completely unusable without JavaScript (English)
25 · 3 months ago

I can’t take it seriously because of the noise in your text, like “Huh?”. If you’d like to have a conversation, please be more open next time.
Source code is the code before some kind of transpilation. Obfuscated code is not source code.
I get it, you just need the content. But why would you reload the whole page when you’re just about to get the next news item on the page? Isn’t it better to just update that part?
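A minimal sketch of “just update that part” (the `/api/news/latest` endpoint, its response shape, and the `news-list` element are hypothetical):

```typescript
// Fetch only the newest item and patch it into the existing list,
// instead of reloading the whole page.
async function appendLatestNews(): Promise<void> {
  const response = await fetch("/api/news/latest"); // hypothetical endpoint
  if (!response.ok) return;

  const item: { title: string; url: string } = await response.json();

  const list = document.getElementById("news-list");
  if (!list) return;

  const link = document.createElement("a");
  link.href = item.url;
  link.textContent = item.title;

  const li = document.createElement("li");
  li.appendChild(link);
  list.appendChild(li); // only this part of the DOM changes
}
```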
fxdave@lemmy.ml to Mildly Infuriating@lemmy.world • The sheer amount of websites that are completely unusable without JavaScript (English)
44 · 3 months ago

Who said making unbloated pages is impossible? Your comment would come across as more serious without the emotion.
Source code is the code that gets transformed into some target code. Obfuscated code is not source code.
A reminder: in the past, large pages downloaded everything at once. In contrast, with dynamic imports the first load is much, much faster, and that’s what matters most. Any change in dynamic content only requires the dynamic data to be downloaded. My phone lasts at least 2 days on one charge (average usage), but I charge it every night, so that’s not an issue.
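For reference, the dynamic-import pattern referred to here looks roughly like this (`./heavy-chart` and `renderChart` are placeholder names):

```typescript
// The heavy module is only fetched when the user actually opens the chart,
// so the initial page load ships far less JavaScript.
async function showChart(container: HTMLElement): Promise<void> {
  const { renderChart } = await import("./heavy-chart"); // bundlers split this into its own chunk
  renderChart(container);
}
```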
fxdave@lemmy.ml to Mildly Infuriating@lemmy.world • The sheer amount of websites that are completely unusable without JavaScript (English)
38 · 3 months ago

You were completely fine with slow page reloads blinding you when the theme was dark. I’m speaking to those who appreciate modern tech.
But anyway, unfortunately, JavaScript obfuscation is a common thing.
fxdave@lemmy.ml to Mildly Infuriating@lemmy.world • The sheer amount of websites that are completely unusable without JavaScript (English)
921 · 3 months ago

As a web developer, I see JS as a quality improvement: no page reloads, a nice smooth UI. Luckily, the PHP times have ended, but even in the PHP era disabling jQuery could cause problems.
We could generate static HTML pages; it just adds complexity.
Personally, I use only client-side rendering, and I think that’s the best from a dev perspective: easy setup, no magic, nice UI. And that results in a blank page when you disable JS.
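For illustration, a minimal client-side-rendered entry point (a sketch assuming React 18, not code from the thread). The server ships only an empty `<div id="root"></div>`, and everything below runs in the browser, which is why the page stays blank with JS disabled.

```tsx
import { createRoot } from "react-dom/client";

// The HTML contains nothing but <div id="root"></div>;
// all visible content is rendered here, in the browser.
const container = document.getElementById("root");
if (container) {
  createRoot(container).render(<h1>Rendered entirely on the client</h1>);
}
```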
If your motivation is to stop tracking:
- replace all foreign-domain sources with file URIs, e.g. load Google Fonts from a local cache.
- disable all foreign script files unless they’re legitimate, like JS packages from public CDNs, in which case load them from a local cache.
If your motivation is to see old-style HTML pages with minimal styling, well, it’s impossible to do that reliably. If you are worried about closed-source JS, you shouldn’t be: it runs in an isolated environment. If something is possible for JS and you want to limit that capability, contribute to browsers. That’s the clear path.
I can be convinced. What’s your motivation?
fxdave@lemmy.ml to Mildly Infuriating@lemmy.world • The sheer amount of websites that are completely unusable without JavaScript (English)
222 · 3 months ago

I’m a webdev. I agree. I like React.




Downvoting because the title says “best” and I disagree. Apple products have a bunch of drawbacks; I wouldn’t buy them even if the hardware is strong and efficient.