=====Frequently Asked Questions About Hosting Your Own AI=====
====1. WHAT MODEL DO I USE?!====
While this guide will attempt to point you in the right direction and save you some time finding a good model, **it is literally impossible to give a definitive answer.** There is no "best for", "best right now", "best among", or really any other kind of "best". It's "best" to let go of "best". ;-)

In order to pick a model one must consider:
  - **Compatibility**: what formats you can use,
  - **Resources**: depending on your situation, you might be limited by GPU speed, VRAM, CPU speed, DRAM, hard disk space, or (less likely) bandwidth,
  - **Tradeoffs**: Fast, Cheap, Good: choose at most two ("Good" and "Easy" draw from the same well),
  - **Surprises**: probably some other considerations, and **finally**,
  - **your use case**.

This grumpy pile of text is gradually turning into a guide--hopefully not too misguided--for selecting models.
  
===How do I know if my model is compatible with my system?===
It's probably not a 100% guarantee, but we can reduce the chances of a wasted download.

The situation is really complicated, but this is a FAQ so I'll keep it simple:
  - **Read the model card.** If it doesn't have one, don't download it. The model card is also the most likely place to find reasons a model might not work for you.
  - If you know the model will fit completely in VRAM, the best performance comes from GPTQ models. (As of 2023-12; I haven't personally verified this.)
  - If the model will not fit **completely** in VRAM, you **cannot** use GPTQ; use GGUF instead.
    * GGUF comes in multiple quantization formats. You only need to download one; use Q5_K_M if you're not sure. (See the sketch below.)

More details on the formats can be found [[ai:formats-faq|here]].
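
As a concrete (and hedged) example, here is a minimal Python sketch of fetching exactly one GGUF quantization from Hugging Face with the ''huggingface_hub'' package and doing a back-of-the-envelope VRAM check. The repo and file names are made-up placeholders, not recommendations, and "file size roughly equals memory needed" is only a rule of thumb.

<code python>
# Sketch: download a single GGUF quantization and sanity-check it against VRAM.
# The repo_id and filename are hypothetical; substitute the model you picked.
import os
from huggingface_hub import hf_hub_download  # pip install huggingface_hub

repo_id = "SomeUser/Some-Model-7B-GGUF"   # placeholder repo
filename = "some-model-7b.Q5_K_M.gguf"    # one quantization file is enough

path = hf_hub_download(repo_id=repo_id, filename=filename)

vram_gib = 8.0                            # set this to your card's VRAM
file_gib = os.path.getsize(path) / 1024**3

# Rough rule of thumb: the GGUF file size approximates the memory the weights
# need; leave some headroom for the KV cache and context.
if file_gib < vram_gib * 0.9:
    print(f"{filename} ({file_gib:.1f} GiB) should fit in {vram_gib} GiB of VRAM.")
else:
    print(f"{filename} ({file_gib:.1f} GiB) will spill past VRAM; expect slower generation.")
</code>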

===Are there at least comparisons?===
Sure. If you find a good one, send it to me and I'll add a link here.
  
  * [[https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard|HuggingFace leaderboard]] contains all kinds of scores you might care about.
    * **NOTE**: There is currently (2023-12) controversy about how useful the leaderboard is. This has to do with model contamination. (TODO: add "contamination" to the glossary and maybe make a page about it.)
  * [[https://www.reddit.com/user/WolframRavenwolf/|u/WolframRavenwolf]] is the only Redditor I see posting in-depth comparisons of models. Their testing has a narrow focus and might not match your use case.
  * [[https://nsfw-chatbot-rankings.web.app/#/|NSFW Chatbot Leaderboard]] exists.
  
===Can I at least try one before I download it?===
Yes. Nap does not know how. Please ask for edit permission and fill this section in. <3

===Are there any other shortcuts worth taking?===
I only know of one more: use a model that someone else in your situation is already using and knows works well. **I'd like to collect a few //(dozen)// such reports here**, if possible, along with what hardware and software you used and what speed you get.

  * brain: Passes almost every AGI test given over a 40-year period. 90b tensors, 100t weights, runs on a completely proprietary stack. When it's thinking hard, it generates about 14-16 tokens/second. (It has almost been discovered [[https://www.reddit.com/r/totallynotrobots/comments/7ne308/comment/ds268zl/?utm_source=reddit&utm_medium=web2x&context=3|once]].)

=====Please read these few short paragraphs before diving into The Answers.=====
===Philosophy===
**Know your goals.** It is **critical** that you know what you want your AI to do for you. Even better if you have it written down.
  
===What Things Can AI Do Right Now?===
  * LLMs generate text and code (see the sketch just after this list)
    * They can integrate with... (fill this in plz)
  * Diffusers generate images
    * Upscaling
    * Fill-in and fill-out
    * Video
    * (anything else?)
  * Data format conversions
    * OCR ("optical character recognition", which is just a fancy way of saying "image-to-text")
    * Speech-to-text (partially a classification problem; might be better served with other tools)
    * Text-to-speech (though this might also be better served by other tools)
  * **lots of other stuff.**
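
Since "LLMs generate text" is easier to see than to describe, here is a minimal sketch using the Hugging Face ''transformers'' pipeline API. The tiny ''gpt2'' model is used only so the example runs almost anywhere; swap in whatever model you actually chose.

<code python>
# Minimal "LLMs generate text" example with the transformers pipeline API.
# gpt2 is a stand-in so this runs on modest hardware; expect rough output.
from transformers import pipeline  # pip install transformers

generator = pipeline("text-generation", model="gpt2")

result = generator("Hosting your own language model is", max_new_tokens=40)
print(result[0]["generated_text"])
</code>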

===What CAN'T AI do right now?===
  * LLMs are still pretty bad with math.
  * Music generation is in its infancy.
  * OCR for music transcription is still a hilariously impractical idea.
  * **lots of other stuff.**
  
===What kind of hardware should I buy?===
It depends. (I know, I know... I hate that answer too, but it's the truth.)
  
Buying a CPU for inference is folly. The only advantage a CPU has is that it usually has more DRAM than the GPU has VRAM, so it can load larger models. The difference in inference speed is at least an order of magnitude. When choosing a GPU, the most important factor is how much VRAM it has (see the snippet after this list for checking yours).

  * For maximum ease and speed, buy Nvidia GPUs. They are really expensive, though.
  * For a reduced cost, more headaches, and fewer applications that currently support it, buy AMD. They're still pretty expensive.
  * Intel GPUs have the best price/VRAM ratio of the bunch, but there is almost no support. Getting them to work is (mostly) almost impossible even for experienced system administrators.
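
If you are not sure how much VRAM you actually have to work with, here is a small check, assuming a CUDA (or ROCm) build of PyTorch is installed; ''nvidia-smi'' or your vendor's equivalent tool will tell you the same thing without any Python.

<code python>
# Report how much VRAM each visible GPU has, assuming PyTorch with GPU support.
import torch

if not torch.cuda.is_available():
    print("No usable GPU found; inference will run on the CPU (slowly).")
else:
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.1f} GiB VRAM")
</code>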
  
===What do all these terms mean?===
(nap definitely needs help with this)
    * need a glossary
  
===How do I do the thing?===
  * Start with <nowiki>README.MD</nowiki> for the software you want to use. Seriously.
  * Links to how-to's
  
===How do I get help with the thing?===
    * Read <nowiki>README.MD</nowiki> for the software you want to use again. Seriously.
    * Discord servers