Vellium v0.3.5 Revolutionizes Local AI Writing with Native KoboldCpp and OpenAI TTS

Vellium, the open-source AI writing assistant for local LLMs, has launched its most significant update yet, introducing a reimagined writing mode, native KoboldCpp integration, and OpenAI-compatible text-to-speech. The release marks a major leap for indie authors and developers seeking privacy-first AI storytelling tools.

The open-source AI writing platform Vellium has unveiled version 0.3.5, a transformative update that redefines the landscape of local, privacy-focused creative writing tools. Designed for authors, worldbuilders, and AI enthusiasts who prioritize offline operation, Vellium now offers a fully reengineered writing mode, seamless native integration with KoboldCpp, and support for OpenAI-compatible text-to-speech (TTS) — all under an MIT open-source license. The update, announced by developer tg-prplx on Reddit’s r/LocalLLaMA community, represents a substantial leap forward in usability, performance, and creative control for users leveraging local large language models.

At the heart of the update is a complete overhaul of Vellium’s writing mode. Where the previous mode offered little structure, the new system introduces a comprehensive book bible: a centralized repository for worldbuilding elements such as character profiles, lore, timelines, and geography. Authors can now import entire manuscripts directly from DOCX files, and the platform automatically generates and caches summarized overviews of each chapter and character arc. The redesigned sidebar is significantly more compact, reducing visual clutter while keeping critical tools within reach. Perhaps most notable is the AI-powered character patch-editing feature, which lets writers refine dialogue, motivations, or backstories by prompting the model to suggest edits that stay consistent with the established narrative.
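
To make the import-and-summarize idea concrete, here is a purely illustrative TypeScript sketch of reading a DOCX manuscript and caching per-chapter summaries. It uses the third-party mammoth package for text extraction; the chapter-splitting heuristic and the summarize() stub are assumptions for illustration only, not Vellium's actual pipeline.

```typescript
// Purely illustrative sketch of a DOCX import with cached chapter summaries.
// The chapter split heuristic and summarize() stub are assumptions, not
// Vellium's actual implementation.
import mammoth from "mammoth";

const summaryCache = new Map<string, string>();

// Placeholder summarizer: in a real tool this would be a local-LLM call.
async function summarize(text: string): Promise<string> {
  return text.trim().slice(0, 200) + "…";
}

async function importManuscript(path: string): Promise<Map<string, string>> {
  // Extract the plain text of the manuscript.
  const { value: fullText } = await mammoth.extractRawText({ path });
  // Naive split on "Chapter N" headings; real manuscripts need more care.
  const chapters = fullText.split(/\n(?=chapter\s+\d+)/i);
  for (const [i, chapter] of chapters.entries()) {
    const key = `chapter-${i + 1}`;
    if (!summaryCache.has(key)) {
      summaryCache.set(key, await summarize(chapter));
    }
  }
  return summaryCache;
}

importManuscript("manuscript.docx").then((cache) =>
  console.log([...cache.keys()])
);
```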

For users running local LLMs, the integration of KoboldCpp has been elevated from a third-party plugin to a fully native component. This means Vellium now natively supports advanced KoboldCpp features including provider:memory for persistent context retention, universal tagging for cross-model compatibility, and n-sigma sampling for improved output diversity. Payload structures have been standardized to align with official API specifications, resolving longstanding inconsistencies in model loading and response handling. Additionally, tool calling — a feature that can interfere with local model performance — is now automatically disabled in the UI when KoboldCpp is active, eliminating a common source of instability.
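
The KoboldCpp side is easiest to picture as a plain HTTP call against the server's generate endpoint. The minimal sketch below assumes a local KoboldCpp instance on its default port; the memory field carries persistent context, while the name of the n-sigma sampling parameter (nsigma here) is an assumption and may differ by KoboldCpp version.

```typescript
// Minimal sketch of a KoboldCpp generate call against a local server on the
// default port. The "nsigma" field name is an assumption and may differ
// between KoboldCpp versions.
interface KoboldGenerateRequest {
  prompt: string;
  memory?: string;      // persistent context injected ahead of the prompt
  max_length?: number;  // tokens to generate
  temperature?: number;
  nsigma?: number;      // assumed name for the top n-sigma sampling control
}

async function generate(req: KoboldGenerateRequest): Promise<string> {
  const res = await fetch("http://localhost:5001/api/v1/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
  if (!res.ok) throw new Error(`KoboldCpp returned ${res.status}`);
  const data = await res.json();
  // KoboldCpp responds with { results: [{ text: "..." }] }
  return data.results[0].text;
}

// Usage: keep book-bible material in `memory` so it survives across turns.
generate({
  prompt: "Continue the scene where Mira enters the archive.",
  memory: "Mira: archivist, cautious, speaks in short sentences.",
  max_length: 200,
  temperature: 0.8,
}).then(console.log);
```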

On the audio front, Vellium introduces an OpenAI-compatible TTS engine with a dedicated translation model, enabling users to generate spoken narration of their text in multiple languages without relying on cloud services. A new Zen Chat UI mode strips away all interface elements except the conversation window, creating a distraction-free environment ideal for deep writing sessions. The update also resolves long-standing bugs around project deletion and export reliability, including proper handling of inline scenes — a feature critical for nonlinear storytelling. Phrase bans now function as intended, and default badword filtering has been turned off to give users full editorial control.
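
For the TTS piece, an OpenAI-compatible endpoint means the standard /v1/audio/speech request shape should work against a local server. In the sketch below, the base URL, model id, and voice name are placeholders rather than values documented by Vellium.

```typescript
// Minimal sketch of an OpenAI-compatible TTS call. The base URL, model id,
// and voice name are placeholders that depend on the backend you configure.
import { writeFile } from "node:fs/promises";

async function narrate(text: string, outPath: string): Promise<void> {
  const res = await fetch("http://localhost:8880/v1/audio/speech", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "tts-1",   // placeholder model id
      voice: "alloy",   // placeholder voice name
      input: text,
      response_format: "mp3",
    }),
  });
  if (!res.ok) throw new Error(`TTS server returned ${res.status}`);
  // The endpoint returns raw audio bytes; write them straight to disk.
  const audio = new Uint8Array(await res.arrayBuffer());
  await writeFile(outPath, audio);
}

narrate("Chapter one. The rain had not stopped for three days.", "chapter1.mp3");
```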

Beneath the surface, Vellium v0.3.5 includes critical security and performance enhancements. Runtime data leaks have been patched, server bundle resolution within ASAR archives has been fixed, and local-mode security has been hardened against potential exploits. Multi-character chat interactions now respond more intelligently, with the system prioritizing replies from characters whose names are referenced in the prompt — a subtle but powerful improvement for narrative cohesion. The project’s official transition to the MIT license further encourages community contributions and commercial use without legal barriers.
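
The name-based reply prioritization can be illustrated with a small sketch. This is not Vellium's actual implementation, only one way the described behavior could be modeled: characters mentioned in the latest prompt are moved to the front of the reply order.

```typescript
// Illustrative sketch of prioritizing replies from characters whose names
// appear in the latest prompt. Not Vellium's actual implementation.
interface Character {
  name: string;
  aliases?: string[];
}

function orderResponders(prompt: string, cast: Character[]): Character[] {
  const lower = prompt.toLowerCase();
  const mentioned = (c: Character) =>
    [c.name, ...(c.aliases ?? [])].some((n) => lower.includes(n.toLowerCase()));
  // Mentioned characters reply first; stable sort keeps the rest in order.
  return [...cast].sort((a, b) => Number(mentioned(b)) - Number(mentioned(a)));
}

// Example: "Ask Mira about the archive" puts Mira ahead of the others.
console.log(
  orderResponders("Ask Mira about the archive", [
    { name: "Joren" },
    { name: "Mira", aliases: ["the archivist"] },
  ]).map((c) => c.name)
);
```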

According to the developer’s announcement on Reddit, the update bridges the gap between professional writing software and the flexibility of local AI models. Unlike cloud-based alternatives, Vellium keeps all data on the user’s machine, a critical feature for authors working with sensitive or unpublished material. With this release, Vellium positions itself as one of the most robust, user-centric AI writing environments for offline creators.

Users can download Vellium v0.3.5 from its GitHub repository. The developer invites feedback and bug reports as the community prepares for future iterations, including potential support for multi-modal inputs and enhanced collaborative editing.
