tngtech/deepseek-r1t-chimera:free
microsoft/mai-ds-r1:free
nvidia/llama-3.1-nemotron-ultra-253b-v1:free
I was struggling with error messages before, but after following this information,
it works perfectly with my API key and deepseek/deepseek-chat-v3-0324:free.
Really appreciate the help!
You were right, I just needed to get rid of ":free"
"Free Models: Some models are available for free, but even these have strict rate limits (typically 20 requests per minute and 200 requests per day"
I tried it one more time and the same thing happened.
The API key was set to V3 and unlimited tokens, so I have no idea why it doesn't work.
URL: https://openrouter.ai/api
KEY: (your API key)
MODEL: deepseek/deepseek-chat-v3-0324:free
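For anyone debugging the same setup, this sketches the HTTP request those three fields presumably feed into. The endpoint and JSON shape follow OpenRouter's chat-completions API; the key value is a placeholder, and how the game actually assembles the call is an assumption.

```python
import json

# OpenRouter's chat-completions endpoint.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(api_key: str, model: str, prompt: str):
    """Return (url, headers, body) for one chat completion call."""
    headers = {
        "Authorization": f"Bearer {api_key}",  # the key created on the OpenRouter keys page
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,  # e.g. "deepseek/deepseek-chat-v3-0324:free"
        "messages": [{"role": "user", "content": prompt}],
    })
    return OPENROUTER_URL, headers, body
```

Posting that body with those headers (e.g. via `urllib.request` or `requests`) is a quick way to confirm outside the game that the key and model name are valid.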
In the URL field I should type: https.openrouter/keys (etc) (I can't copy and paste text in the game, so I'll just have to type it manually)
My API key from it goes below.
And DeepSeek V3 as the MODEL?
"Recent cyberattacks have targeted DeepSeek v3, including a critical database leak that exposed over a million records, including system logs, user prompts, and API tokens.
Security researchers have identified flaws in DeepSeek v3's AI models, making them susceptible to jailbreaking techniques. This could potentially allow malicious actors to manipulate the system into generating harmful content or exposing confidential data.
Some data transmissions within the platform have been found to occur without proper encryption, increasing the risk of interception."
Also, generation seems shorter than it was, and that's likely due to how the dev updated the game; I think I can't change the context/output token size.
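If the game ever exposed the raw request, output length is normally capped with OpenRouter's standard `max_tokens` field. A sketch under that assumption (whether the game lets you set it is unknown, and the field name here follows the public API, not the game's config):

```python
import json

# Hypothetical request body with an explicit output-token cap.
payload = json.dumps({
    "model": "deepseek/deepseek-chat-v3-0324:free",
    "messages": [{"role": "user", "content": "Hello"}],
    "max_tokens": 1024,  # cap on generated tokens; the game may hardcode a smaller value
})
```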