

BSD would have been a much better fit for many reasons. It was just started with Linux for mostly irrelevant reasons, and then it was too hard to switch away.
What does that do? Just run no-ops, heating up the CPU? How does it help?
I am making a physiotherapy game console for sick kids. I was in talks with an American company to develop it into a commercial product. Due to research grants being cut, the company can’t afford to pursue my project. So, I had to do some offline networking, and it turns out there are local companies here in Europe that are also interested in this concept.
Because Jesus didn’t compile the Bible. That was done centuries after Christ. The Old Testament is mostly relevant for its prophecies of Christ, so of course Christ used the Old Testament to prove that he was the one the prophecies refer to. It’s basically the spiritual backstory.
Only if you take the deal.
They aren’t a technical bug, but a UX bug. Or would you claim that an LLM that outputs 100% non-factual hallucinations and no factual information at all is just as desirable as one that doesn’t?
Btw, LLMs don’t have any traditional code at all.
Yes, your fan art infringed on Blizzard’s copyright. Blizzard lets it slide because there’s nothing to gain from pursuing it, apart from a massive PR disaster.
Now, if you sold your Arthas images on a large enough scale, Blizzard would clearly come after you. Copyright is not only about the damages caused by people not buying Blizzard’s stuff, but also about the license fees they didn’t get from you.
That’s the real big difference: if Midjourney were a little hobby project of some guy in his basement that never saw the light of day, there wouldn’t be a problem. But Midjourney is a for-profit tool with the express purpose of letting people make images without paying an artist, and the way it does that is by using copyrighted works.
It’s not anthropomorphizing, it’s how new terms are created.
Pretty much every new term ever draws on already existing terms.
A car is called a car because that term was earlier used for streetcars, before that for passenger train cars, before that for cargo train cars, before that for a chariot, and originally for a two-wheeled Celtic war chariot. Not a lot of modern cars have two wheels and a horse.
A plane is called a plane because it’s short for airplane, which derives from aeroplane, a term that once meant the wing of an aircraft and originally denoted the shell casings of a beetle’s wings. And not a lot of modern planes are actually made of beetle wing shell casings.
You can do the same for almost all modern terms. Every term derives from a term that denotes something similar, often in another domain.
Same with AI hallucinations. Nobody with half an education would think that the cause, effect and expression of AI hallucinations are the same as for humans. OpenAI doesn’t feed ChatGPT hallucinogens. It’s just a technical term that means something vaguely related to what the term originally meant for humans, same as with “plane” and “beetle wing shell casing”.
In that case, whoosh me. Just wanted to make sure nobody takes that as actual advice.
Same with me: my employer would have to start worrying once I put my current position into my LinkedIn profile.
I agree with your final take, but why would you want to take frontend tickets if you can also do backend work?
Nope. Hallucinations are not a cool thing. They are a bug, not a feature. The term itself is also far from cool or positive. Or would you think it’s cool if a human had hallucinations?
Hallucinations mean something specific in the context of AI. It’s a technical term, same as “putting an app into a sandbox” doesn’t literally mean that you pour sand into your phone.
Human hallucinations and AI hallucinations are very different concepts caused by very different things.
590–620 nm. Identical to orange.
The difference between brown and orange is the brightness level, and since the eyes adjust brightness automatically, brightness levels are only perceived in context.
Light appears darker if there’s brighter light around, and vice versa. Shine brown/orange light into a dark room, and it will appear orange. Shine the same light in a brighter context, and it will appear brown.
It’s exactly the same thing as, e.g., dark blue vs. light blue. Both share the exact same wavelength, and their relative brightness only becomes apparent in context.
If you’ve ever been to a cinema and you saw anything brown or orange on screen, you have seen the effect. If you have ever seen a dim conventional light bulb in a bright room, you have seen it too.
Brown has just as much a wavelength as orange, because it’s the same color.
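You can see the “brown is just dark orange” point numerically: uniformly scaling an RGB color down changes only its brightness (the HSV value), not its hue or saturation. A minimal Python sketch (the specific RGB triple for “orange” and the 0.4 scale factor are just illustrative choices):

```python
import colorsys

# A typical orange, as normalized RGB (roughly #FFA500).
orange = (1.0, 165 / 255, 0.0)

# "Brown": the exact same color at 40% intensity, i.e. the same light, dimmer.
brown = tuple(c * 0.4 for c in orange)

h_orange, s_orange, v_orange = colorsys.rgb_to_hsv(*orange)
h_brown, s_brown, v_brown = colorsys.rgb_to_hsv(*brown)

# Hue and saturation are (numerically) identical; only the value differs.
print(abs(h_orange - h_brown) < 1e-9)  # True
print(abs(s_orange - s_brown) < 1e-9)  # True
print(v_brown < v_orange)              # True
```

In other words, there is no “brown hue” in HSV terms at all; brown only exists as a low-value orange judged against brighter surroundings.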
Amazon games does it too.
Pokemon games also routinely come close.
That’s what you’ve got emoji for.
One big problem is that pretty much all of these devices have major downsides. For example, I don’t know a single repairable or rugged phone with a really good camera or a flagship SoC.
They also usually have a huge markup and are often produced by small boutique manufacturers with terrible support (like Fairphone) and/or really bad software (like Fairphone).
So you have the choice: pay €600 for a Fairphone with its terrible camera, battery life problems, nonexistent support, huge amount of bugs and frequent issues with network providers (e.g. VoLTE not working), or pay €300 for a comparable Samsung with a similar software support duration (6 vs. 10 years) that just works without issues.
If there were something like a Samsung A56 or even a Samsung S25 that’s nicely repairable and costs at most €100 more than the regular version, that might be worth it.
But the way it is now, it’s much better to buy a regular phone and spend the €300 you saved on 1-2 professional battery replacements down the line.