Oh okay so if I sell the EU oil in Euros that’s no big deal then? Not gonna come kill me?
- 0 Posts
- 143 Comments
Aria@lemmygrad.ml to Technology@lemmy.ml • Domain registrar NameCheap bans Zionism Observer website two weeks after promoting new Israeli-linked CEO Hillan Klein
6 · 9 days ago
There’s a website called NameCheap where you can buy website names. (For example lemmy.ml)
The website Zionism Observer had done that, and they used their website to document statements made by Israeli officials. Primarily genocidal statements and confessions to crimes.
NameCheap decided to take the website down, which they can do as they are responsible for the paperwork and admin required to have the website function.
Two weeks ago, NameCheap got a new CEO, Hillan Klein, who is linked to Israel.
Aria@lemmygrad.ml to Asklemmy@lemmy.ml • Narrowing it down, should I live in the Netherlands, Portugal, or Spain?
7 · 10 days ago
The idea of learning three languages at once through an app in a short period is silly. Which of these languages do you actually stand a chance of becoming fluent in? Are you already conversational? It’s a much lower bar to hold a conversation with someone who’s trying to teach you the language and is patient with you than what’s needed for everyday life. But if you can meet that bar today, you’ll probably be able to learn the language well too.
Aria@lemmygrad.ml to Asklemmy@lemmy.ml • Narrowing it down, should I live in the Netherlands, Portugal, or Spain?
53 · 10 days ago
I assure you the Dutch speak Dutch. Many Dutch people also know English because it’s a mandatory subject in school, but if I said that in the UK they don’t speak English, and that they all speak French, would you think I was accurately describing the situation in the UK?
Aria@lemmygrad.ml to Asklemmy@lemmy.ml • Would you join a Lemmy server hosted by your govt that requires an ID to post comments or content? (lurkers no ID required)
21 · 3 months ago
Yes, but why Lemmy? There is more mature forum software. Your proposal doesn’t benefit from federation, and it doesn’t benefit from a UI optimised for discussing external links. You could of course keep using your real-ID account to interact with the Fediverse, but no one else is doing real-ID, so I don’t see that as a positive.
Aria@lemmygrad.ml to United States | News & Politics@lemmy.ml • Liberals are catalysts to catastrophe, again
5 · 4 months ago
How was he allowed to publish this on AJ?
Aria@lemmygrad.ml to Technology@lemmy.ml • Apple calls for changes to anti-monopoly laws and says it may stop shipping to the EU
5 · 4 months ago
“demands for interoperability with non-Apple products”
“the rules were not applied to Samsung”
I wonder if Samsung owns a walled garden where their products aren’t interoperable with those from other manufacturers.
Aria@lemmygrad.ml to United States | News & Politics@lemmy.ml • Jimmy Kimmel Pulled “Indefinitely” By ABC Over Charlie Kirk Comments
11 · 4 months ago
I expected him to have said something celebratory, or at least to have made light of it. The only thing he said is that the ‘MAGA gang’ are capitalising on his death.
Aria@lemmygrad.ml to United States | News & Politics@lemmy.ml • Utah governor says the motive in Kirk shooting is not yet certain but the suspect was on the left
5 · 4 months ago
“dark corners of the internet. Clearly there was a lot of gaming going on,”
“deep, dark internet, the Reddit culture,”
Well, this part seems trustworthy.
Aria@lemmygrad.ml to Technology@lemmy.ml • China Reportedly Advances to 5nm AI Chips as Domestic Firms Tape Out Two New Solutions For Model Training & AI PC Workloads
2 · 4 months ago
Don’t you have that backwards? Without TSMC’s outstanding technology, the island’s value decreases, both for China and for the USA. Conventional wisdom is that reduced tensions also reduce the risk of war.
Aria@lemmygrad.ml to Technology@lemmy.ml • Huawei enters the GPU market with 96 GB VRAM GPU under 2000 USD, meanwhile NVIDIA sells from 10,000+ (RTX 6000 PRO)
1 · 5 months ago
300i: https://www.bilibili.com/video/BV15NKJzVEuU/
M4: https://github.com/itsmostafa/inference-speed-tests
It’s comparable to an M4: maybe a single order of magnitude faster than a ~1000 euro 9960X at most, not multiple orders. And if we’re considering buying used, since this is a brand-new product that’s less available in western markets, a CPU-only option with an EPYC and more RAM will probably be a better local LLM computer for the cost of two of these and a basic computer.
Aria@lemmygrad.ml to Technology@lemmy.ml • Huawei enters the GPU market with 96 GB VRAM GPU under 2000 USD, meanwhile NVIDIA sells from 10,000+ (RTX 6000 PRO)
3 · 5 months ago
That’s still faster than your expensive RGB XMP gamer-RAM DDR5 CPU-only system, and depending on what you’re running, you can saturate the buses independently, doubling the speed and roughly matching a 5060. I disagree that you can categorise the speed as negating the capacity; they’re different axes. You can run bigger models on this. Smaller models will run faster on a cheaper Nvidia. You aren’t getting 5080 performance and 6x the RAM for the same price, but I don’t think that’s a realistic ask either.
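As a rough sanity check on the bandwidth argument above: batch-1 LLM decoding is memory-bound, so tokens per second is roughly memory bandwidth divided by the bytes of weights streamed per generated token. The bandwidth and model-size figures below are illustrative assumptions, not measured numbers for this card:

```python
def est_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Rough upper bound for batch-1, memory-bound LLM decoding:
    each generated token streams (roughly) all active weights from
    memory once, so throughput ~= bandwidth / model size."""
    return bandwidth_gb_s / model_size_gb

# Illustrative numbers (assumptions, not measurements): a ~200 GB/s
# LPDDR4X card vs a ~100 GB/s dual-channel DDR5 desktop, both running
# a large model quantised down to ~40 GB.
card = est_tokens_per_sec(200, 40)     # 5.0 tok/s
desktop = est_tokens_per_sec(100, 40)  # 2.5 tok/s
print(card, desktop)
```

The point being that capacity decides *whether* a model runs at all, while bandwidth only scales *how fast* it runs, which is why they are different axes.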
Aria@lemmygrad.ml to Technology@lemmy.ml • Huawei enters the GPU market with 96 GB VRAM GPU under 2000 USD, meanwhile NVIDIA sells from 10,000+ (RTX 6000 PRO)
1 · 5 months ago
I’m not saying you can deploy these in place of Nvidia cards where the tooling is built with Nvidia in mind. I’m saying that if you’re writing code, you can do machine learning projects without CUDA, including training.
Aria@lemmygrad.ml to Technology@lemmy.ml • Huawei enters the GPU market with 96 GB VRAM GPU under 2000 USD, meanwhile NVIDIA sells from 10,000+ (RTX 6000 PRO)
3 · 5 months ago
I agree with your conclusion, but these use LPDDR4X, not DDR4 SDRAM. It’s significantly faster. The lack of fans should also be seen as a positive, since it means they’re assuming the cards aren’t going to melt. It costs them very little to add visible active cooling to a 1000+ euro product.
Aria@lemmygrad.ml to Technology@lemmy.ml • Huawei enters the GPU market with 96 GB VRAM GPU under 2000 USD, meanwhile NVIDIA sells from 10,000+ (RTX 6000 PRO)
1 · 5 months ago
You can run llama.cpp on CPU. LLM inference doesn’t require any features that only GPUs have; that’s why it’s possible to make even simpler NPUs that can still run the same models. GPUs just tend to be faster. If the GPU in question is not faster than an equally priced CPU, you should use the CPU (better OS support).
Edit: I looked at a bunch of real-world prices and benchmarks, and read the manual from Huawei, and my new conclusion is that this is the best product on the market if you want to run, at modest speed, a model that doesn’t fit in 32GB but does in 96GB. Running multiple in parallel seems to range from unsupported to working poorly, so you should only expect to use one.
Original rest of the comment, made with the assumption that this was slower than it is, but had better drivers:
The only benefit to this product over a CPU is that you can slot in multiple of them and they parallelise without needing to coordinate anything with the OS. It’s also a very linear cost increase as long as you have the PCIe lanes for it. For a home user with enough money for one or two of these, they would be much better served spending the money on a fast CPU and 256GB of system RAM.
If not AI, then what use case do you think this serves better?
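To illustrate the claim above that inference doesn’t depend on GPU-only features: the core of a transformer decode step is matrix-vector products plus a softmax, both of which run fine on any CPU. This is a toy sketch in pure Python, not llama.cpp’s actual implementation:

```python
import math

def matvec(w, x):
    """Plain matrix-vector product: the dominant operation in LLM
    decoding. Nothing here requires GPU-specific hardware features."""
    return [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

def softmax(v):
    """Numerically stable softmax, as applied to output logits."""
    m = max(v)
    e = [math.exp(x - m) for x in v]
    s = sum(e)
    return [x / s for x in e]

# Toy "model": one weight matrix, one input vector (made-up values).
w = [[0.1, 0.2], [0.3, 0.4]]
x = [1.0, 2.0]
logits = matvec(w, x)  # approximately [0.5, 1.1]
probs = softmax(logits)
print(logits, sum(probs))
```

Real runtimes just do this at scale with SIMD and quantised weights; NPUs hard-wire the same multiply-accumulate pattern, which is why they can run the same models with even less generality than a GPU.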
Aria@lemmygrad.ml to Technology@lemmy.ml • Huawei enters the GPU market with 96 GB VRAM GPU under 2000 USD, meanwhile NVIDIA sells from 10,000+ (RTX 6000 PRO)
1 · 5 months ago
CUDA is not equivalent to AI training. Nvidia offers useful developer tools for using their hardware, but you don’t have to use them. You can train on any GPU, or even on a CPU. The projects you’ve looked at (?) just chose CUDA because it was the best fit for the hardware they had on hand, and they were able to tolerate the vendor lock-in.
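To make the “you can train without CUDA” point concrete: backpropagation is ordinary arithmetic. Here is gradient descent fitting a one-parameter linear model in pure Python, with no GPU or vendor library involved. It’s a toy sketch, not representative of any particular project:

```python
# Toy CUDA-free training loop: fit y = w * x by gradient descent
# on mean squared error, using nothing but plain CPU arithmetic.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # true slope is 2

w = 0.0
lr = 0.05
for _ in range(200):
    # dL/dw for L = mean((w*x - y)^2) is mean(2 * (w*x - y) * x)
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 3))  # converges to ~2.0
```

Frameworks on top of CUDA add speed and convenience, not capability: the same update rule runs on any hardware that can multiply and add.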
Aria@lemmygrad.ml to United States | News & Politics@lemmy.ml • Amtrak Rolls Out New High-Speed Trains Running Slower Than the Old Ones
1 · 5 months ago
260 km/h is still good. As long as it’s significantly faster than driving, it’s deserving of some praise.
Aria@lemmygrad.ml to Technology@lemmy.ml • China’s Domestic x86 CPU, the Zhaoxin KX-7000, Debuts in an AI PC by MAXHUB, Positioning It as a Viable Alternative to Intel/AMD Options
61 · 5 months ago
Does it have any sort of on-board NPU to make it AI-oriented?
Aria@lemmygrad.ml to Technology@lemmy.ml • TikTokers are calling LA ICE raids 'music festivals' to trick the algorithm
4 · 5 months ago
“Unalive” started being widely used around 2020-2021.
Next time they’ll start with a false flag and get those stats back on track.