lemmy ui on desktop and mobile
There’s also viscosity, adhesion
I checked the list of 370k English words I downloaded from GitHub a while ago, and yeah, it’s true other than the variants of homeowner (homeowners, homeownership)
I was looking at some other random words; here are some I found:
This got me interested, so I wrote a program to find each time a small word bridges the gap between the two larger words in a compound word. Honestly, the funnier part of its output is the weird ‘compound words’ it’s finding, like “asp: aspirating: as, pirating” or “at: deepseated: deepsea, ted” (ted, apparently, meaning ‘to scatter hay for drying’). Occasionally it finds good ones, like “ices: apprenticeship: apprentice, ship” or “hen: archenemy: arch, enemy”, and it did find the meow one. It does allow the small word to contain the first word of the compound, because that can still give some interesting ones like “warp: warplanes: war, planes”. It probably would have worked a lot better if I had actually used a list of compound words; instead it tries to find its own, very slowly, which does at least let it find every possible combination for any word
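The search described could be sketched roughly like this (a minimal sketch, not the actual program; `find_bridges` and the length thresholds are made-up names/values, and it assumes the word list is already loaded into a set):

```python
def find_bridges(words, min_part=3, min_bridge=3):
    """Find (bridge, compound, left, right) tuples where `compound`
    splits into dictionary words `left` + `right`, and `bridge` is a
    dictionary word whose letters straddle the split point."""
    results = []
    for compound in words:
        # try every split point that leaves two reasonably long halves
        for i in range(min_part, len(compound) - min_part + 1):
            left, right = compound[:i], compound[i:]
            if left not in words or right not in words:
                continue
            # any word that starts before the split and ends after it
            # (this is what allows "warp" to contain "war")
            for start in range(0, i):
                for end in range(i + 1, len(compound) + 1):
                    bridge = compound[start:end]
                    if (bridge != compound and len(bridge) >= min_bridge
                            and bridge in words):
                        results.append((bridge, compound, left, right))
    return results

# tiny toy dictionary just to show the output format
words = {"arch", "enemy", "archenemy", "hen",
         "war", "planes", "warplanes", "warp"}
for hit in find_bridges(words):
    print("%s: %s: %s, %s" % hit)
```

On a real 370k-word list the nested loops over every split and every substring are what make it so slow, but they’re also why it can find every possible combination.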
anyways, here’s the list
you don’t need shift-right-click to do either of those things on YouTube: you can always right click on a thumbnail and get the normal menu, and if you right click twice on a video you get the normal menu
Yes, but 200 GB is probably already with 4-bit quantization; the weights in fp16 would be more like 800 GB. IDK if it’s even possible to quantize further, but if it is, you’re probably better off going with a smaller model anyways
Also worth noting that the 200 GB is for fp4; fp16 would be more like 800 GB
PCIe will probably be the bottleneck way before the number of GPUs is, if you’re planning on storing the model in RAM. Probably better to get a high-end server CPU.
I don’t have access to Llama 3.1 405B, but I can see that Llama 3 70B takes up ~145 GB, so 405B would probably take ~840 GB just to download the uncompressed fp16 (16 bits/weight) model. With 8-bit quantization it would probably be closer to 420 GB, and with 4-bit closer to 210 GB. 4-bit quantization is really going to start hurting the model outputs, and it’s still probably not going to fit in your RAM, let alone VRAM.
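The scaling here is just linear in parameter count, halving with each halving of bits per weight; a quick sketch of the arithmetic (starting from the observed 145 GB for 70B at fp16):

```python
# Download size scales linearly with parameter count,
# and halves with each halving of bits per weight.
gb_70b_fp16 = 145                      # observed Llama 3 70B size, fp16
gb_405b_fp16 = gb_70b_fp16 * 405 / 70  # scale up to 405B parameters
print(f"fp16:  ~{gb_405b_fp16:.0f} GB")      # ~839 GB
print(f"8-bit: ~{gb_405b_fp16 / 2:.0f} GB")  # ~419 GB
print(f"4-bit: ~{gb_405b_fp16 / 4:.0f} GB")  # ~210 GB
```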
So yes, it is a crazy model. You’d probably need at least 3 or 4 A100s to have a good experience with it.
for that sort of thing you can look it up in a private window, where at least Google will have to pretend not to be tracking you
If everybody else is doing the same thing, yeah.
I was talking more about whether the existence of an image AI, regardless of the images it generates, breaks copyright law because of how it was trained on copyrighted images
as someone who never really experienced the old internet, all I hear are the positive sides and I feel like the guy in xkcd 239
Unions would probably work, as long as you get some people the company doesn’t want to replace in there too
Maybe also federal regulations, although those would probably just slow it down, because models are being made all around the world, including in places like Russia and China that the US and EU don’t have legal influence over
Also, it might just be me, but it feels like generative AI progress has really slowed. It almost feels like we’re approaching the point where we’ve squeezed the most out of the hardware we have, and now we just have to wait for the hardware to get better
Well, current law is not written with AI in mind, so what current law says about the legality of AI doesn’t reflect its morality or how we should regulate it in the future
Chromebooks are insanely locked down at schools. I got one on eBay for $40, installed Linux, and now it can play Minecraft Java at 60 fps, so that’s something.
I wouldn’t say ‘only’. There were a lot of downvoted things that were just controversial.
Often there are multiple ways to interpret a poster’s intentions, and if you see a heavily downvoted comment you will automatically assume the worst.
that would probably look fine with nearest-neighbor scaling; I looked it up and there’s a CSS property for that (image-rendering: pixelated)
www.xkcd.com/795/