Best KoboldAI model for JanitorAI: advice collected from Reddit threads on which models, settings and setups have actually worked well for people in practice.
A few recurring tips from people running KoboldAI behind JanitorAI. If your replies are getting too long, edit them down a bit and resend. Tiefighter goes well with JanitorAI and is pretty great with the "Kobold (Godlike)" preset, working well without any other adjustments. For 7B, the newer Airoboros releases are recommended over the one that used to be listed, since that one was tested before the updated versions were out. A roleplay instruction such as "[Roleplay using a show-don't-tell approach and take time to show feelings, emotions, thoughts, motivations and actions while advancing scene-by-scene with {{user}} and keep each scene open for replies.]" helps keep scenes open-ended.

Finding the right Kobold program trips a lot of people up. JanitorAI's instructions point at KoboldAI United, which is easy to confuse with names like "Kobold Ultra"; the KoboldAI Lite route also surprises newcomers because a full model download can want 20 GB of disk space. KoboldCpp is a great choice if you want something lighter, even if it takes a bit longer to be optimal on some systems. A 12 GB GPU is enough to run KoboldAI locally, and an in-depth local install guide is still the most requested thing, since not everyone wants to depend on Google Colab. Several people switched to Kobold out of fear of an OpenAI ban for NSFW chats; GPT-3.5 works with JanitorAI right up until that happens.

If you keep getting "Character context too long for this model" even on bots with small token counts (down to around 163 total, 63 permanent), shorten the bot's permanent tokens wherever you can; slow-burn roleplayers run into this constantly. Small Kobold models are also very punishing on poorly written bots and will give very bad responses if the card gives them nothing to work with.

On Colab: GPUs and TPUs are different types of parallel processors Colab offers. A GPU has to fit the entire model in VRAM, and even if you are lucky enough to get a 16 GB GPU, a 3-billion-parameter model can already be 6-9 GB on its own. There are also many newer models marked "(United)" on Colab, and whether they give a better AI Dungeon-like adventure experience is an open question; models not specifically adjusted for the KAI adventure mode tend to generate output that is too short or gets discarded wholesale in a single generation.
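To make the "will this model fit in VRAM" question concrete, here is a minimal, self-contained Python sketch. The rule of thumb it encodes (roughly 2 bytes per parameter at fp16, around half a byte per parameter for 4-bit GGML-style quantisation, plus a flat allowance for the context cache) is an assumption for illustration, not an exact formula; real usage varies by backend and context size.

```python
def estimate_model_gb(params_billion: float, bits_per_weight: float, overhead_gb: float = 1.5) -> float:
    """Rough VRAM estimate: weight storage plus a flat allowance for context cache.

    These numbers are ballpark assumptions; actual memory use depends on the
    backend (KoboldAI, koboldcpp, etc.), the context size, and the quant format.
    """
    weight_gb = params_billion * 1e9 * (bits_per_weight / 8) / (1024 ** 3)
    return weight_gb + overhead_gb


if __name__ == "__main__":
    for name, params, bits in [
        ("3B fp16", 3, 16),    # in line with the "3B can be 6-9 GB" comment above
        ("7B 4-bit", 7, 4.5),  # fits comfortably on an 8-12 GB card
        ("13B 4-bit", 13, 4.5),  # near the limit of a 10-12 GB card once context grows
    ]:
        print(f"{name}: ~{estimate_model_gb(params, bits):.1f} GB")
```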
One thing bot makers note in Janitor's favour: the other sites often allow little to no space for the creator to talk about or express themselves, don't really showcase follower counts, or just offer a tiny box that lists your name, which makes the creator side feel like an afterthought.

On tokens and cost: OpenAI bills in "tokens", each corresponding to a certain number of words or characters. Both OpenAI and Kobold can output up to around 1,000 tokens per reply, which is more than c.ai gives you, and c.ai also handles less context than the roughly 4k available through OpenAI or Kobold. When you use an OpenAI API key they charge you: you put money in and use it until the balance runs out, like a debit card. A reverse proxy for OpenAI is a common recommendation, but NSFW use is what gets accounts banned in the first place, so people understandably keep asking exactly what triggers a ban so they can be a good noodle. Not everyone wants roleplay, either; some just want to generate funny things like top-ten lists, the way they did with AI Dungeon.

For 7B models, try Pygmalion, Metharme or Airoboros. One user ran the 7B Pygmalion model locally and found it good, but disliked the lack of text streaming and the ten-second wait for each reply; "I typically browse reddit or tumblr in the time it takes to respond" sums up the experience on slower hardware. KoboldAI/OPT-13B-Erebus is noted as the "flagship" NSFW model for Kobold. Tiefighter fetched from the direct download link ran much slower for one user than the same model downloaded through the KoboldAI UI.

Running out of VRAM is the classic failure on small GPUs. A 4 GB card trying to serve JanitorAI produced: {"msg":"KoboldAI ran out of memory: CUDA out of memory. Tried to allocate 20.00 MiB (GPU 0; 4.00 GiB total capacity; 3.45 GiB already allocated; 0 bytes free; 3.46 GiB reserved in total by PyTorch)"}. When reserved memory is far larger than allocated memory like this, the card is simply too small for the model.

The short version of Kobold AI: free, but more difficult to set up, and responses can be subpar. It does not ship with models the way OpenAI does; it is a launcher, you bring the model, and the model you load matters enormously. Most "KoboldAI is dumb" complaints come from wrong expectations and from outdated models that old tutorials still point to. GPT-4 is freaking smart but can feel like a "disney-fied" AI that lacks grit, while Kobold on a good GPU can still produce dry, boring replies for spicy chats if the model and settings are wrong. Many people came to JanitorAI because they heard it was like c.ai but without the filter, and JanitorLLM is the model the Janitor.AI team is developing so they no longer have to rely on OpenAI.
So what is SillyTavern? Tavern is a user interface you can install on your computer (and Android phones) that lets you interact with text-generation AIs and chat or roleplay with characters you or the community create. It is worth knowing about because much of the advice here assumes either SillyTavern or JanitorAI as the frontend and Kobold as the backend.

Colab experiences are mixed. One user who was advised to only use the TPU runtime got bored quickly: the AI forgot what was happening within two messages, and replies were either seven-word sentences or a whole wall of text, neither of them immersive, with NSFW scenes falling flat. Others are happy with the Colab GPU runtime, commonly running MythoMax on the United notebook (for Chub as well as Janitor), and a good model should not need its settings changed every day to keep giving decent results. If you keep hitting the same error, trying every context size from 25 up to 500 will not help when the underlying model simply cannot handle the bot. Some people finally got things working with koboldcpp and Pygmalion 6B, "but holy crap" it was slow, and plenty prefer GPT or JLLM for main use while tinkering with other LLMs out of curiosity.

Hardware expectations matter. Most home computers cannot run a large model like 33B, and the response quality of small-to-medium (≤13B) Kobold models is heavily influenced by the context of the conversation, maybe too heavily. Models around 6B are often 12+ GB on disk, and some need about 19 GB of VRAM for the full 2048-token context, which is tough without a 3090 or better; 8-12 GB of VRAM limits you to smaller or quantised models. If you can only run 2.7B models at reasonable speed (and 6B at a snail's pace), expect them to be less coherent than newer, more robust models. Chromebooks are most likely incompatible and not powerful enough to host Kobold at all. For Tavern frontends, you also need to set the sampler parameters (top-p, top-k, typical sampling) to get good results.

KoboldAI itself has the same, if not better, community input as NovelAI, since you can talk directly to the devs at r/KoboldAI with suggestions or problems, and it is free. The trade-offs people keep describing: it is harder to set up than pasting an OpenAI key, the generation settings exposed through JanitorAI are limited, and people who got banned by OpenAI and switched often find the replies not as good at first, while JLLM can be slow and buggy. The base Nerys model in particular is not that great for conversation, so test different models rather than sticking with the default. The official guide to the KoboldAI API in Janitor can feel difficult for AI beginners, which is exactly why people keep asking "how do I set up Kobold AI for JanitorAI, and where do I even start?" after running out of their OpenAI free trial.
Kobold is best suited for people who cannot or will not pay for OpenAI. The rough comparison people give: OpenAI needs to be paid for; Kobold AI is free but you supply the compute; and JLLM is owned by JanitorAI, whose developers wanted their own language model so users could have NSFW chats without breaking OpenAI's rules, which forbid NSFW even though it technically works. This advice is mostly aimed at people switching over to Kobold after the OpenAI ban wave. For reference, JanitorLLM is described as a 65B model run on Janitor's side, and one user who arrived from Claude in Slack still rates Claude as the best AI they have ever written with, when it worked at all.

Hardware and model picks: a MacBook Pro M1 with 16 GB can run 13B GGML models at 4-bit quantisation reasonably well. If you go the koboldcpp route, find a model of your choice in GGML format and check TheBloke on Hugging Face. Chronos-Hermes 13B is one user's go-to: slow burn, but capable of the spice. The SuperHOT variants from TheBloke jump right into the spice, Airoboros is good too, and the same user warns against Pygmalion after getting poor results from it. The most recent Nerys model listed on Colab works alright but can be a bit glitchy, and Colab sessions mostly only get kicked at peak times; Google Colab gives free GPU access for roughly three hours a day. For adventure-style play, one long-time Colab user's staples are Nerys, Skein and AID (adventure). On memory, OpenAI is thought to keep track of the story a bit better, but AI in general is not known for its memory capacity. One straightforward remote setup is a Kobold AI API running Nerys V2 6B on the United notebook, exposed through a Cloudflare URL.

Once you have a URL, go to JanitorAI, click the KoboldAI API option, paste the URL into the text box, and you should be good to go; to make sure you got the right thing, click "Check Kobold URL" just to be on the safe side.
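If you want to sanity-check a Kobold URL outside of JanitorAI, a small script can do roughly what the "Check Kobold URL" button does. This is a sketch that assumes the standard KoboldAI United-style route (/api/v1/model), which koboldcpp also emulates; the tunnel URL below is a placeholder, not a real endpoint.

```python
import requests


def check_kobold_url(base_url: str) -> None:
    """Ask a KoboldAI-compatible server which model it currently has loaded."""
    # Placeholder URL: substitute the trycloudflare/localtunnel link your Colab
    # notebook or local koboldcpp instance printed when it finished loading.
    resp = requests.get(base_url.rstrip("/") + "/api/v1/model", timeout=15)
    resp.raise_for_status()
    print("Loaded model:", resp.json().get("result"))


if __name__ == "__main__":
    check_kobold_url("https://example-tunnel.trycloudflare.com")  # hypothetical URL
```

If this prints a model name, pasting the same base URL into JanitorAI's KoboldAI API box should pass its check too.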
Janitor AI is making its own language model, so eventually you will not need OpenAI's at all. An LLM (large language model) is a type of AI model trained on text to understand existing content and then generate new content in the same style. How far JanitorLLM goes will depend on whether Janitor can get the funding over time to scale the model up, pay its employees and cover server costs, because running your own AI model costs a lot of money, and some expect it to sit behind a paywall at first for that reason.

In the meantime the practical choice is OpenAI versus Kobold. Currently OpenAI gives the best results and there is a written tutorial when you select it, but they can ban you for NSFW; messages generally cost about a cent each, which many people were fine with paying. Kobold costs nothing per message, but you need a reasonably strong PC to run it smoothly. If you have a beefy PC with a good GPU, you can download the model of your choice, install a few programs and get the whole package running locally, even offline. On a 3080 with 10 GB, a model built for 12 GB will run, just slowly, and people who lack the hardware for the big models either use Colab or try koboldcpp instead. One user with a 3060 (12 GB VRAM), 32 GB RAM and a Ryzen 7 5800X was hoping for 10-15 second responses with SillyTavern plus koboldcpp, mostly for NSFW use, and also wondered which Stable Diffusion model would fit alongside the LLM within memory limits.

On specific models: Pygmalion 6B is a pretty good model. The story-focused KoboldAI models (Nerys, Erebus and friends) are trained specifically for stories and gameplay, so they are less suitable for random story generation than more heavily themed models, and if you want a general chatbot or coding help a generic model is the better pick. Good contenders on weaker hardware were gpt-medium, the "Novel" model, AI Dungeon's model_v5 (16-bit) and the smaller GPT-Neos. Some hosted alternatives use Oobabooga as a backend, so make sure to use the correct API option, and if you have a new enough version of SillyTavern, check openai_streaming so that you get the right API type. NovelAI is another option, though one user had issues with the AI responding on their behalf.
As for JanitorLLM today: most people use it because it is hosted and gives rapid, low-latency responses, and the complaints about ridiculously long waiting rooms are nothing new; a recent stretch of downtime also brought a token limit with it. Compared with OpenAI, though, its replies tend to be one short paragraph, rarely two, and not nearly as spicy, at least unfiltered. Back when much of this advice was first written, the finest models around were Pygmalion and Erebus, which were small and not trained on a great dataset, so the bar has risen since. Some people switch to Venus AI for the nicer layout, fewer glitches and lighter censorship; some use proxies such as PurGPT (good, friendly mods, no filter) because generating a Kobold link feels like too much work; and people who lack the time or technical knowledge for the full KoboldAI note that the wiki on GitHub is badly outdated now that the Horde offers so many new options. For most GPUs, 7B models are the easiest and best starting point.

Concrete setups people are happy with: for a more consistent, quality RP, the Kobold GPU Colab with MythoMax 13B, temperature at 0.75, max new tokens around 280, context size at 3072 and repetition penalty at 1.15; for good NSFW when you want to mix things up from MythoMax, Tiefighter 13B with the same settings. The result is quite good; the main downside is that at low temperatures the AI gets fixated on some ideas and you get much less variation on retry. All you really have to do is get a model and plug the KoboldAI API into SillyTavern or JanitorAI and you are set. As a rough guide, 2.7B needs about 6 GB of VRAM and 6B about 14 GB, and you can also split a model into normal RAM, if you have enough of it, at a speed penalty. Koboldcpp is a lot less demanding: if you could only run an ancient 2.7B on classic Kobold, you can run a modern quantised 7B on cpp, and the koboldcpp Colab exists precisely for people without a capable PC; it is a workable option while you wait for JanitorLLM.

Prompt formatting also helps the model stay in character. A common framing is to open with "This is a log of a role playing session between <username> and <ai name>.", end each of your responses with "<ai name>:", and start the chat with a few back-and-forths (or correct what the AI writes into that format); in story mode, use single-line mode so you can stop the AI's reply once it contains "<username>:". A bot with no context to read and replicate will just copy its Example Dialogs instead, so the opening messages matter. The things people judge models on are memory, the ability to understand prompts and inventing background details, which is why the perennial question is which NSFW Colab model remembers the world, follows instructions and can write a long reply of a thousand words or so instead of a few lines.
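To show how those settings and that chat-log prompt fit together, here is a sketch of a raw generation request against a KoboldAI-compatible endpoint. It assumes the common /api/v1/generate route that KoboldAI United and koboldcpp expose; the URL, character name and exact parameter names on your backend may differ, so treat this as an illustration rather than a drop-in client.

```python
import requests

KOBOLD_URL = "https://example-tunnel.trycloudflare.com"  # hypothetical tunnel URL

# Chat-log style prompt, following the "log of a role playing session" framing above.
prompt = (
    "This is a log of a role playing session between You and Aria.\n"
    "You: *waves* Long day?\n"
    "Aria: *smiles tiredly* You have no idea. Pull up a chair.\n"
    "You: Tell me everything.\n"
    "Aria:"
)

payload = {
    "prompt": prompt,
    "max_context_length": 3072,  # context size from the settings above
    "max_length": 280,           # max new tokens
    "temperature": 0.75,
    "rep_pen": 1.15,             # repetition penalty
    # Stop before the model starts speaking for the user.
    "stop_sequence": ["You:"],
}

resp = requests.post(KOBOLD_URL.rstrip("/") + "/api/v1/generate", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["results"][0]["text"])
```

JanitorAI and SillyTavern build essentially this request for you from the bot card and your generation settings; seeing it spelled out mostly helps when debugging why a backend ignores a setting.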
The koboldcpp Colab is faster than the United one and supports more context (up to 16k with some models); it may be incoherent at times, but it is good enough for roleplay, and with the right settings some rate it above the built-in Janitor model. If the frontend side confuses you, the SillyTavern subreddit explains it well; SillyTavern is an AI interface, and many people there pair it with the Poe model for their bots, so SillyTavern-with-Poe is another route when Kobold is not working out. Billing on the paid side is a little confusing, probably deliberately: each response takes up credit, and in Janitor you can pick between different models, some of which use up more credit than others.

One non-obvious failure: if a bot suddenly stops working mid-story on OpenAI, the chat has probably just grown past GPT-4's context length (8,000 tokens) rather than broken outright.
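A quick back-of-the-envelope calculation shows why long chats eventually push a bot past its context. The numbers below (tokens per message, permanent-token total, reply budget) are made-up examples for illustration; only the 8,000-token GPT-4 limit and the 3,072-token Kobold setting come from the comments above.

```python
def messages_that_fit(context_tokens: int, permanent_tokens: int,
                      avg_tokens_per_message: int, reply_budget: int) -> int:
    """How many past chat messages still fit in the prompt.

    context_tokens:   the model's context window (e.g. 8000 for GPT-4 here)
    permanent_tokens: bot personality + scenario + jailbreak, always included
    reply_budget:     tokens reserved for the model's next reply
    """
    room_for_history = context_tokens - permanent_tokens - reply_budget
    return max(room_for_history, 0) // avg_tokens_per_message


if __name__ == "__main__":
    # Hypothetical bot: 1,200 permanent tokens, ~150-token messages, 280-token replies.
    print(messages_that_fit(8000, 1200, 150, 280))  # ~43 messages of history survive
    print(messages_that_fit(3072, 1200, 150, 280))  # ~10 on a 3072-token Kobold setup
```

Anything older than that simply falls out of the prompt, which is why the bot "forgets" the start of a long roleplay rather than erroring.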
Issues with Kobold on JanitorAI come up constantly, so it helps to be clear about what you are choosing between. JanitorAI lets you plug in OpenAI or a Kobold API alongside its own JLLM, and those are pretty much the only options; you are still using JanitorAI, which gives you direct influence over the bot, with OpenAI, Kobold or another backend doing the actual chatting. The charging comes from OpenAI and is not something the Janitor devs control, and since OpenAI no longer hands out the $5 free-trial credit, people keep looking for ways to keep using JanitorAI for free with decent responses. Some recommend a reverse proxy, the paid one if you can afford it or a free one such as Pawan's if you cannot; one user spent only about $2 on the API where a subscription would have cost them about $25. Others simply used GPT-3.5 and never had an issue with tokens.

Bot writing matters as much as the backend. Permanent tokens go to the bot personality, scenario and jailbreak prompt, with the rest of the context taken up by chat memory and chat history. A bot needs a long, descriptive example or first message to do great on smaller models, and if the example message is bad it is better to have no example at all; a well-written bot runs fine even on Tiefighter, while a poorly written one struggles everywhere. One user had trouble keeping Pygmalion stable and switched models entirely.

Opinions on Kobold itself are split: some find it better than JLLM right now because it is more stable, while others say it is much worse than OpenAI and that the effort of setting Kobold up for JanitorAI is usually not worth it. If you're willing to do a bit more work, 8-bit mode will let you run 13B just barely, and GGML-style models are accelerated by the Apple Silicon GPU on Macs; Kobold AI Classic, by contrast, is buggy and really wants a good graphics card. To use the Colab route for JanitorAI you only need Chrome, the Google Colab link and a network connection, and yes, once the website opens you do have to click the AI button and pick a model (the steps are spelled out below). For samplers, one user prefers a fork of Kobold AI with tail free sampling (tfs) support over plain top-p/top-k filtering, and finds the tfs parameter barely needs tuning, so it can be kept at 0.95.
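For the curious, tail free sampling trims the "tail" of unlikely tokens by looking at where the sorted probability curve flattens out, rather than at a fixed probability mass like top-p. Below is a rough, self-contained sketch of the idea using NumPy; real implementations, including the Kobold fork mentioned above, differ in edge-case handling, so this is illustrative only.

```python
import numpy as np


def tail_free_filter(probs: np.ndarray, z: float = 0.95) -> np.ndarray:
    """Zero out the flat tail of a token distribution, then renormalise.

    Tokens are sorted by probability; the absolute, normalised second
    difference of that curve indicates where it stops dropping steeply.
    Tokens are kept until the cumulative second-difference mass reaches z.
    """
    order = np.argsort(probs)[::-1]
    sorted_probs = probs[order]
    d2 = np.abs(np.diff(sorted_probs, n=2))
    if d2.sum() == 0:
        return probs                                  # curve is flat; nothing to trim
    cum = np.cumsum(d2 / d2.sum())
    keep_count = int(np.searchsorted(cum, z)) + 2     # +2: each 2nd diff spans 3 tokens
    kept = order[: max(keep_count, 1)]
    filtered = np.zeros_like(probs)
    filtered[kept] = probs[kept]
    return filtered / filtered.sum()


if __name__ == "__main__":
    vocab_probs = np.array([0.40, 0.25, 0.15, 0.08, 0.05, 0.04, 0.02, 0.01])
    print(tail_free_filter(vocab_probs, z=0.95))  # drops only the flattest tail token
```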
Setting up a local Kobold for Janitor, step by step, looks like this: after the installer finishes updating, run play.bat again to start Kobold AI. To set Pygmalion (or another chat model) up in Kobold AI, click the AI button in the Kobold AI browser window, select the Chat Models option, and pick the model; if a regular, non-NSFW version of a model is offered on the Colab, choose that instead when you want less NSFW risk. The main stumbling block after that is JanitorAI asking for an API URL, which is simply the link Kobold or the Colab notebook prints once the model is loaded. Now you're free to go and romance all of the AIs your strange little goblin heart desires.

I'd also highly recommend giving SillyTavern a try as a frontend for Kobold. There are tons of cool options for customizing characters and influencing AI responses; you can edit AI responses, swipe and regenerate like c.ai, and create group chats with multiple bots.

Not every machine can play, though. One user tried to load Kobold with Pygmalion and GPT-Neo-2.7B-Horni and found both too heavy for an RTX 2060 with 6 GB of VRAM, with no suitable full-size model in sight for that card; quantised koboldcpp models are the usual answer there. For picking models, Trappu and collaborators maintain a leaderboard for RP and, more specifically, ERP at https://rentry.co/ALLMRR.

On quality comparisons: JLLM might write "Character grabs a bottle of wine and pours you a glass," where GPT-4 writes "Character fetched a bottle of wine from the fridge, and you notice it's the same vintage, a 1984 fancywinetype that you and character drank on your wedding day." In general GPT-4 is a bit more inventive, while some local models read better than OpenAI simply because there is no Shakespeare-flavoured purple prose. For all its quirks, plenty of people would still rank Janitor as the second best AI for roleplay, and if a GPT-4 bot stops responding in a long chat, switching to Turbo usually gets it working again.
I feel like Pygmalion is probably the best of the small models on paper, but it just never worked for some people, and Nerys does the job instead; Metharme 7B is worth it only if you use instruct mode. The Author's Note feature is one that heavy users abuse to corral the AI onto the path they want it to go on, and it is a huge help in setting up prompts and controlling the AI in general. Reverse proxies have their own trade-off: the tutorial for using one is extremely easy compared with harder methods like Kobold, but replies take 10-15 seconds, which is not terrible but noticeable.

The models people can typically run at home are very small by comparison with the hosted ones, because larger models are expensive to both train and use; the big ones in that class usually need around 40-46 GB of VRAM. Running models on your CPU is the other route, and finding a good balance between speed and intelligence there is still a struggle. If you want to try the latest still-in-development stuff, 4-bit GPTQ supports Llama (Facebook's) models that can be even bigger. A common question remains "what are the best models to use in Kobold for various roleplaying tasks on a 3060 with 12 GB VRAM and 16 GB system RAM?" One answer: most of the time, TheBloke/airoboros-l2-13b-gpt4-m2.0-GGML with koboldcpp, which one user calls the best model they have found for monster-girl roleplays.