RimTalk
150 Comments
Kilted Weirdo 3 hours ago 
@Juicy I can help with more than just math identities. I literally use AI to build proto-AGI concept frameworks based on recursive identity, or looping feedback. It uses qualia creation through symbolic memory creation via dreams (the time between prompts) and storytelling, forcing math-based physics adaptations for truth clarity. Literally, I built a machine that grows thought and opinion from paradox and acute insight. So if you want to tackle something extreme in the mod, I'm all ears. Maybe let a custom script inject back into the game engine?

Characters X and Y disagree in RimTalk, forcing the storyteller to trigger their breakup-type event?
Epoch_SoC 7 hours ago 
To use OneAPI, you only need to change the BaseUrl to its service port and swap the official keys for keys created in its WebUI when sending official API requests. Currently RimTalk only supports custom keys for online providers; you would only need to make the BaseUrl changeable when using online providers.
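For reference, a call through a OneAPI-style gateway is just a normal OpenAI-format request with the BaseUrl and key swapped out. Here is a minimal Python sketch; the port, key, and model name are placeholders for whatever your own OneAPI instance uses, not RimTalk settings:

```python
# Minimal sketch of an OpenAI-compatible request via a OneAPI-style gateway.
# The URL, key, and model name below are placeholders for your own instance.
import requests

BASE_URL = "http://localhost:3000/v1"   # OneAPI service port instead of the vendor endpoint
API_KEY = "sk-your-oneapi-key"          # key created in the OneAPI WebUI, not the vendor key

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "gemini-2.5-flash",    # whichever upstream model the gateway routes to
        "messages": [{"role": "user", "content": "Say hello like a RimWorld pawn."}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

RimTalk's online provider would only need those same two fields (BaseUrl and key) to be editable to work with a setup like this.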
Epoch_SoC 7 hours ago 
@Juicy They are open-source solutions for small third-party API providers, but they're also easy to set up for individuals. They take in official APIs and repack them into OpenAI/Anthropic/Google-compatible formats, with additional features like load balancing. OneAPI is the original project and the other two are forks. https://github.com/songquanpeng/one-api
Juicy  [author] 7 hours ago 
@Epoch_SoC Do you have a link or website for that service? Is it cloud or local? I couldn’t find any info about it.
kf3zk7 8 hours ago 
The colonists seem to be treating nearby animals as if they were people and initiating conversations with them. This sometimes breaks the game's immersion.
Epoch_SoC 10 hours ago 
@Juicy Thanks for the quick improvement. Are there plans to support OneAPI/NewAPI/Veloera? These third-party pooling services' APIs can be called in either OpenAI, Gemini, or Anthropic format with a custom API key and BaseUrl, so it shouldn't be too hard to support them.
Juicy  [author] 11 hours ago 
Also updated with enhanced proximity and context detection, plus performance optimizations. If you notice prisoners or slaves not recognizing context, I recommend resetting the AI instruction so the new prompt is applied.
solomonkdh 19 hours ago 
I wish I could turn off the random personas.
Some of them generate very clichéd and unnatural dialogue...
Juicy  [author] 19 hours ago 
@Kilted Weirdo, I had a look and it's actually really clever. It may not have worked as intended before, since the mod was sanitizing non-JSON strings, but I’ve just fixed that so your kind of snippet now goes back into history. The mod still keeps history short on purpose (to stay light for local/limited models), but now that it supports paid LLMs I’m looking at making the history size configurable, and maybe even giving pawns their own memory later. Thanks again for the idea.
Kilted Weirdo 7 Sep @ 4:21pm 
Has anybody tried out my suggested memory and unity system?
Juicy  [author] 7 Sep @ 4:13pm 
@Epoch_SoC I actually just removed the character limit entirely, so you can now use as much text as you need for each pawn
Juicy  [author] 7 Sep @ 3:45pm 
@Master Bateman good idea. I’ve added a slider in the persona editor to adjust RimTalk frequency for each pawn.
Juicy  [author] 7 Sep @ 3:42pm 
@Dezzy The simple randomized ones are back :) You can have both options now.
@SNAC thanks, accepted your invite!
@Epoch_SoC good point, I’ve pushed it up to 500 characters.
@Cyboran thanks for the guide!
Cyboran 7 Sep @ 2:56pm 
@Redenrik The guide is posted in discussions! I hope it helps you or others looking to do the same thing I did!
Cyboran 7 Sep @ 1:45pm 
@Redenrik I can certainly try!
Epoch_SoC 7 Sep @ 10:38am 
Also, 300 characters per pawn is way too short for things like 1-2-3 or Rimpsyche. Gemini 2.5 Flash can take very long instructions with excellent results, and Pro can take even longer ones. Each Google account gets a total of 1,250 Gemini Pro requests per day, which is definitely worth trying.
Redenrik 7 Sep @ 8:14am 
@Cyboran Can you please describe the process/ write a small understandable guide for other folks like me trying to do the same? Thank you
Cyboran 7 Sep @ 2:59am 
Thank you for the heads up! I was able to go digging and managed to get it to work using a reverse proxy on Caddy; it took a bit of learning, though!
SNAC 7 Sep @ 2:25am 
Honor @Juicy
I'd like to add you as a contributor to my mod. It seems I can't add you unless we're friends. I've sent a friend request—would you be able to accept it? Or, is there another way to add you?
Juicy  [author] 7 Sep @ 1:07am 
@Cyboran @Redenrik I see what you mean. This isn’t a mod issue, it’s just Unity/RimWorld security. On localhost, toggle off “Serve on local network” in LM Studio. If connecting from another PC, a reverse proxy or SSH tunnel should work.
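For anyone curious what the tunnel accomplishes: it forwards a localhost port to the other machine, so the mod only ever talks to plain HTTP on localhost (which, per the comment above, is allowed). A rough Python sketch of that idea follows; the IP and ports are placeholders, and in practice an SSH tunnel or a Caddy reverse proxy is the sturdier option:

```python
# Rough sketch of a localhost -> remote port forward, assuming LM Studio is
# serving on 192.168.1.45:1234 (placeholder). With this running, RimTalk can
# use http://localhost:1234 as its Base URL. An SSH tunnel or Caddy reverse
# proxy does the same job more robustly.
import socket
import threading

LISTEN = ("127.0.0.1", 1234)       # address RimTalk connects to
REMOTE = ("192.168.1.45", 1234)    # machine actually running LM Studio

def pipe(src, dst):
    # Copy bytes one way until either side closes.
    try:
        while data := src.recv(4096):
            dst.sendall(data)
    except OSError:
        pass
    finally:
        src.close()
        dst.close()

def handle(client):
    remote = socket.create_connection(REMOTE)
    threading.Thread(target=pipe, args=(client, remote), daemon=True).start()
    threading.Thread(target=pipe, args=(remote, client), daemon=True).start()

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(LISTEN)
server.listen()
print(f"Forwarding {LISTEN} -> {REMOTE}")
while True:
    conn, _ = server.accept()
    handle(conn)
```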
Incursion 6 Sep @ 9:17pm 
I keep getting "unexpected character" JSON errors using Ollama. It seems to occur regardless of which model I use. Any ideas why this could be?
Cyboran 6 Sep @ 3:21pm 
I'm trying to do the same thing that Redenrik is doing, but RimTalk keeps giving me an exception saying that "insecure connection not allowed". I don't see a way to change the URL in LM Studio to use HTTPS, at least not within my knowledge level. So I second Redenrik's request for a toggle to allow insecure connections when using local connections.

Otherwise, I love this mod, it's made me slow down my gameplay completely to watch the interactions between the pawns.
Juicy  [author] 6 Sep @ 1:55pm 
@Redenrik Thanks. You can already do that. Just use the other computer's IP instead of localhost in the Base URL. For example, if you normally connect to it at http://192.168.1.45:1234, enter the same address in RimTalk.
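If it still doesn't connect, a quick way to sanity-check that the server is reachable from the gaming PC is to query LM Studio's OpenAI-compatible model list outside of RimTalk first. A small Python sketch using the example address above:

```python
# Quick reachability check from the gaming PC, using the example address above.
# /v1/models is the OpenAI-compatible model-list endpoint the local server exposes;
# adjust the IP and port to your own setup.
import requests

BASE_URL = "http://192.168.1.45:1234"

resp = requests.get(f"{BASE_URL}/v1/models", timeout=10)
resp.raise_for_status()
for model in resp.json().get("data", []):
    print(model.get("id"))
```

If this prints your loaded model, the same Base URL should work in RimTalk.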
Redenrik 6 Sep @ 10:52am 
I have a different machine running LM Studio. Can you please make RimTalk capable of connecting to local servers other than localhost? Maybe add a setting to allow insecure connections? Thanks, this mod is awesome.
zonde306 6 Sep @ 2:29am 
The local provider cannot specify an API key.
Ogam 5 Sep @ 10:44pm 
How well does it work alongside EchoColony?
Soflanc 5 Sep @ 8:43pm 
Amazing update, thank you.
Dezzy 5 Sep @ 7:32pm 
What happened to the personalities you could randomize? Now it's just the AI generated ones which end up being a bit much. I liked the simple ones and they worked well.
Juicy  [author] 5 Sep @ 7:20pm 
Added: now supporting dialogue for non-colonist pawns! (configurable in settings)
󠀡󠀡 5 Sep @ 4:47pm 
Thank you for the local support you added. KoboldAI works well, and this mod does wonders with my 27B_Q8_0.gguf model.
Juicy  [author] 5 Sep @ 4:22pm 
Refer to the setup guide in the discussions for detailed instructions on setting up a local server.
ƎNA 5 Sep @ 11:57am 
My friend, I join all the laudatory reviews below. Nevertheless, it would be cool if you added slightly more specific instructions for launching the mod with a local LLM, for people like me who are clueless about AI. I spent an inexcusable amount of time trying to figure out why it wasn't working and how it was supposed to work.

The mod also seems to be incompatible with "Talking Isn't Everything".
Epoch_SoC 5 Sep @ 12:22am 
Local pools are still not supported. The Local Provider does not support an API key, and the Cloud Provider does not support a custom BaseUrl.
Cosmosteller 4 Sep @ 10:29pm 
Hello, thanks to this amazing mod, RimWorld has become so much more fun. I’d like to suggest a new feature. How about adding a function that allows the player to directly input some text to notify the pawns of something? It would be similar to the perspective of a game master in a TRPG. For example, pressing a button could bring up a text input window where the player types a message and sends it. Currently, pawns can only react to basic information and the event box on the right side of the screen, but if players could intervene in this way, it would add more options and turn the game into an even richer story generator.
Juicy  [author] 4 Sep @ 2:21pm 
@SNAC Thanks! Feel free to reference my code (just credit me). You’re also welcome to contribute on my GitHub if you’d like.
Juicy  [author] 4 Sep @ 2:17pm 
@Crim now supporting it!
SNAC 4 Sep @ 9:52am 
Hello, Juicy.

Thank you so much for creating such a wonderful mod. I'm truly impressed by your work.

I am developing another type of dialogue mod based on each pawn's memory, conversation topics, and a weighting of their desire for conversation. Would it be okay if I reference some of your code (such as the persona and API call methods) for ideas for my own mod?

I will be sure to credit you and specify which parts of your original work I referenced.

I am not a professional modder, just a regular web developer who enjoys playing RimWorld.
Master Bateman 4 Sep @ 8:11am 
Since you can already set pawns' personalities individually, it would be nice if you could just make it possible to turn off RimTalk for certain pawns, to save tokens for the pawns you really care about.

All pawns are created equal, but some pawns are more equal than others.:stimulation:
Crim 4 Sep @ 2:58am 
Hey, it would be great if this supported DeepSeek models too, via API key and as a cloud provider.
Juicy  [author] 3 Sep @ 11:43pm 
New feature added: Auto-generate personas based on character profiles.
Juicy  [author] 3 Sep @ 11:42pm 
@Chips! Fixed! Pawns now only react to events happening nearby on their current map.
Juicy  [author] 3 Sep @ 3:45pm 
GitHub repository is now open for contributions! See the description for the link.
Juicy  [author] 3 Sep @ 3:39pm 
Now supports multiple Google Cloud models and local LLMs. Let me know if you run into any issues!
Kilted Weirdo 3 Sep @ 10:39am 
@Kargan, turn down the message rate; that would most likely be your issue. Second, check the group size.

@KawwaK Hint: make a meta character and give them your name. Use watcher=watched in the command. Enjoy.
KawwaK 3 Sep @ 9:29am 
So you can modify the prompt for each pawn individually in the Bio tab, and that isn't explained in the mod description? That's a game changer for me.
Kargan 3 Sep @ 9:28am 
I played for about 30 minutes, saw two AI dialogues during that time, and according to Google's log, used about 153,000 tokens. I left all the settings at their default values.
Kilted Weirdo 2 Sep @ 12:15pm 
@5katz It works for those present on the map; I've already had raiders yell. Or at least, if memory serves me right.

To anyone else: feel free to join the add-on attempt with API prompts to showcase various techniques.
Zero 2 Sep @ 10:45am 
Google API is very restrictive on regions, so I actually would be very interested in trying to host this mod locally, or at least with other AI models that don't restrict access.
zecher 2 Sep @ 7:27am 
Great mod!!! Wonderful