JetBrains AI Assistant, Junie, and Kineto Add GPT-5 Support

JetBrains ships GPT-5 as the default model across AI Assistant, Junie, and Kineto. Internal benchmarks show 1.5× to 2× improvements in code quality and task complexity handling over previous OpenAI models.

TL;DR

  • GPT-5 is now the default model in JetBrains AI Assistant, Junie, and Kineto
  • Internal benchmarks show 1.5× to 2× improvements in code quality and task complexity handling over previous OpenAI models
  • JetBrains had early access to test GPT-5 before public release
  • Available now in JetBrains IDE 2025.2+ via plugin updates

The Big Picture

JetBrains just became one of the first development tool vendors to ship GPT-5 integration across its AI product line. This isn't a gradual rollout or an opt-in beta — GPT-5 is now the default model powering AI Assistant's chat, Junie's coding agent capabilities, and Kineto's no-code generation.

The timing matters. GPT-5 is OpenAI's first major model upgrade since GPT-4, and JetBrains secured early access to test and tune the integration before public availability. According to Olivier Godement, OpenAI's Head of Business Products, JetBrains' feedback during early testing helped optimize the API for developer workflows.

For developers already using JetBrains tools, this changes the baseline quality of AI-generated code. The company claims 1.5× to 2× improvements in code quality and task complexity handling compared to previous OpenAI models. That's not incremental — it's the difference between an assistant that needs constant supervision and one that can handle production-level tasks autonomously.

How It Works

GPT-5 integration varies across JetBrains' three AI products, each optimized for different use cases.

AI Assistant uses GPT-5 as the default chat model, balancing generation quality with cost optimization. The model handles everything from code explanations to full feature generation. In one example, a single prompt asking for an HTML5 prototype of a "JetBrains IDE in 2030" produced a fully responsive, styled, and interactive page — no iteration required.

Junie, JetBrains' LLM-agnostic coding agent, gets the most dramatic upgrade. The tool now defaults to GPT-5 but maintains support for other models through plugin settings. In testing, Junie was asked to create a hidden snake game as an Easter egg in a footer. It identified the injection point in a large codebase, accounted for dependencies and layout constraints, and delivered production-ready code with no bugs. That level of contextual awareness — understanding where code fits in an existing architecture — is where GPT-5 pulls ahead of earlier models.
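For a sense of scale: the game logic itself is small. Below is a minimal, purely illustrative Python sketch of snake-game state handling (not JetBrains' codebase and not GPT-5's actual output). The point the article makes is that this part is easy; what GPT-5 reportedly got right was the harder part of wiring such a feature into an existing layout and build without breaking anything.

```python
# Illustrative snake-game core: a grid-based snake that moves, grows on food,
# wraps at the edges, and dies on self-collision. Not actual Junie/GPT-5 output.
from collections import deque


class Snake:
    def __init__(self, width, height):
        self.width, self.height = width, height
        # Body cells, head first; start in the center of the grid.
        self.body = deque([(width // 2, height // 2)])
        self.direction = (1, 0)  # moving right
        self.alive = True

    def step(self, food):
        """Advance one tick. Returns True if the snake ate the food this tick."""
        if not self.alive:
            return False
        hx, hy = self.body[0]
        dx, dy = self.direction
        # Wrap around the grid edges instead of dying at walls.
        new_head = ((hx + dx) % self.width, (hy + dy) % self.height)
        if new_head in self.body:
            self.alive = False  # collided with itself
            return False
        self.body.appendleft(new_head)
        ate = new_head == food
        if not ate:
            self.body.pop()  # move without growing
        return ate
```

A driver loop would call `step()` on a timer and render `body` into the footer's canvas; that rendering and injection layer, not the logic above, is where codebase context matters.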

Kineto, JetBrains' new no-code platform for generating websites and apps, also defaults to GPT-5. The model's strength in frontend tasks makes it particularly effective for single-purpose app generation from natural language prompts. Kineto is currently in Early Access, so GPT-5 support is rolling out to waitlist users first.

Under the hood, JetBrains maintains its multi-provider architecture. You're not locked into GPT-5 — the company supports multiple LLM providers across its AI products. This matters for teams with specific model preferences, compliance requirements, or cost constraints. GPT-5 is the default because JetBrains' internal benchmarks show it performs best, but you can switch models in plugin settings.

The integration requires JetBrains IDE version 2025.2 or later. Once updated, GPT-5 becomes available automatically through the AI Assistant and Junie plugins. No API key configuration or manual model selection is needed unless you want to override the default.

What This Changes For Developers

The practical impact depends on how you currently use AI coding tools. If you're using AI Assistant for code explanations or small refactors, you'll notice faster, more accurate responses. If you're using Junie for multi-file changes or feature implementation, the upgrade is more significant.

The snake game example illustrates the shift. Earlier models could generate a snake game if you provided detailed instructions about file structure, dependencies, and integration points. GPT-5 infers those details from codebase context. It understands that a footer Easter egg needs to avoid layout conflicts, load dependencies without breaking existing scripts, and integrate cleanly with the current build system.

This changes the level of specification required when working with coding agents. You can describe what you want in product terms — "add a snake game Easter egg to the footer" — rather than implementation terms. The model handles architectural decisions that previously required human judgment.

For teams already using JetBrains AI tools, the upgrade is automatic. That's both convenient and potentially disruptive. If your workflow depends on specific model behavior or output patterns, the switch to GPT-5 might require adjustment. JetBrains allows model switching in settings, but the default experience assumes GPT-5 is the right choice for most users.

Cost is another consideration. JetBrains describes GPT-5 as "balanced with optimized costs," but doesn't publish specific pricing. If you're on a JetBrains AI subscription, GPT-5 access is included. If you're using your own OpenAI API key, GPT-5 calls will hit your account at OpenAI's standard rates.

Try It Yourself

To enable GPT-5 in your JetBrains IDE:

  1. Update to JetBrains IDE version 2025.2 or later
  2. Navigate to Settings | Plugins
  3. Update AI Assistant and Junie plugins to the latest versions
  4. GPT-5 will be set as the default model automatically

To verify which model you're using, open AI Assistant chat and check the model selector in the interface. To switch models in Junie, open plugin settings and select your preferred LLM provider.

For Kineto access, join the waitlist at kineto.dev. The platform is in its Early Access Program, so availability is limited.

The Bottom Line

Use GPT-5 in JetBrains tools if you're already invested in the JetBrains ecosystem and want the best available model without switching platforms. The claimed 1.5× to 2× improvement in code quality shows up most clearly in complex, multi-file tasks where context awareness matters.

Skip it if you're not on JetBrains IDE 2025.2+ or if you've standardized on a different LLM provider for compliance or cost reasons. The multi-provider architecture means you're not forced to use GPT-5, but you'll need to manually override the default.

The real risk here is model lock-in. JetBrains positions itself as LLM-agnostic, but defaulting to GPT-5 across all products signals a preference. If OpenAI changes pricing, API terms, or model availability, JetBrains users feel it immediately. The ability to switch models mitigates this, but most developers will stick with the default. Watch how JetBrains handles future model transitions — that will reveal whether multi-provider support is a real strategy or just a hedge.

Source: Junie