Frequently Asked Questions
Everything you need to know about Cloud Development Environments, remote workspaces, and platform engineering
General CDE Questions
A Cloud Development Environment (CDE) is a remote workspace that runs on cloud infrastructure instead of your local laptop. Instead of cloning code to your machine and installing dependencies locally, you connect to a fully configured environment running on AWS, Azure, GCP, or on-premises Kubernetes. Your IDE (like VS Code or JetBrains) connects remotely, but the heavy lifting happens in the cloud. In 2026, CDEs have also become the default runtime for AI coding agents, providing isolated sandboxes where autonomous tools can safely write and test code without touching production systems.
CDEs are most common in these scenarios:
- Regulated industries - Healthcare (HITRUST), finance (SOC 2), and government (FedRAMP) where source code cannot touch local laptops
- Platform engineering teams - Organizations standardizing developer environments across large engineering teams
- Remote-first companies - Teams where contractors and vendors need secure, temporary access to codebases
- AI/ML teams - Developers working with GPU-intensive workloads that exceed laptop capabilities
- AI agent operators - Teams running autonomous coding agents (like Claude Code, Devin, or Copilot Workspace) that need sandboxed environments for safe code generation and testing
Consider CDEs if you are experiencing any of these pain points:
- New developers take 2-5 days to get their environment working
- "Works on my machine" issues slow down development velocity
- Compliance audits require proving code never leaves your VPC
- Developers need more CPU/RAM than laptops can provide
- Lost or stolen laptops create data exfiltration risks
- AI coding agents need isolated, reproducible sandboxes to safely generate and test code
No, but they are related. A DevContainer is a configuration file (devcontainer.json) that defines what tools and dependencies should be installed in a development environment. A CDE is the platform that actually runs that container in the cloud.
Think of it this way: A DevContainer is the recipe, a CDE is the kitchen. Most CDE platforms (Codespaces, Ona (formerly Gitpod), Coder) support DevContainers as a way to define environment configurations.
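To make the "recipe" concrete, here is a minimal devcontainer.json of the kind CDE platforms consume. The field names follow the published DevContainer specification; the specific image, feature, and extension choices are illustrative examples, generated here with Python for clarity:

```python
import json

# A minimal devcontainer.json: the "recipe" a CDE platform (the "kitchen")
# uses to build the workspace. Image, feature, and extension choices are
# example values, not recommendations.
devcontainer = {
    "name": "python-service",
    "image": "mcr.microsoft.com/devcontainers/python:3.12",
    "features": {
        "ghcr.io/devcontainers/features/docker-in-docker:2": {}
    },
    "customizations": {
        "vscode": {"extensions": ["ms-python.python"]}
    },
    "postCreateCommand": "pip install -r requirements.txt",
}

print(json.dumps(devcontainer, indent=2))
```

Because the same file drives local DevContainer tooling and cloud platforms alike, a team can commit it once and get identical environments everywhere.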
While both involve remote access, CDEs are purpose-built development platforms with key differences: environments are defined as code (Terraform/DevContainers) for reproducibility, they integrate directly with your local IDE for a seamless experience, they include lifecycle management (auto-start, auto-stop, auto-delete), and they provide central administration for teams. A plain SSH server requires manual setup and lacks these developer-focused features.
Both - and increasingly, neither. Most CDEs support multiple connection methods: VS Code (via Remote SSH extension) running on your laptop but connected to the cloud workspace, JetBrains Gateway for IntelliJ/PyCharm users, and web-based IDEs (like code-server or VS Code for the Web) accessible directly in the browser. Developers can choose their preferred workflow - the experience feels local, but the code execution happens remotely. In 2026, a growing number of CDE sessions are also headless - driven by AI agents that connect via CLI or API without a human IDE at all.
Technical Questions
Container-based CDEs (like Ona, Codespaces) run workspaces as Docker containers or Kubernetes pods. They are lightweight, start in seconds, and cost less. However, they share a host kernel and may be limited for workloads requiring nested virtualization or full OS access.
VM-based CDEs (like Microsoft Dev Box, some Coder configurations) provide full virtual machines with complete OS flexibility. They can run Windows, Linux, or macOS, support GUI apps, and handle any workload. The tradeoff is higher cost and slower startup times (minutes instead of seconds).
MicroVM-based CDEs are the emerging middle ground in 2026. Technologies like Firecracker and Cloud Hypervisor spin up lightweight VMs in under a second while providing full kernel-level isolation. Platforms like Ona and Daytona have adopted microVMs to combine the speed of containers with the security boundaries of VMs - particularly important for running untrusted AI-generated code.
Most CDEs support:
- VS Code - via Remote SSH extension (most popular)
- JetBrains IDEs - via JetBrains Gateway (IntelliJ, PyCharm, WebStorm, etc.)
- Web IDEs - Browser-based VS Code (code-server) for zero-install access
- SSH/Terminal - Standard terminal access for vim, emacs, or command-line workflows
- AI-native editors - Cursor, Windsurf, and Zed support remote SSH connections, bringing AI-assisted coding to cloud workspaces
Performance depends on several factors:
- Faster compute - Cloud workspaces can have 32+ CPU cores and 128GB+ RAM, far exceeding typical laptops
- Faster builds - Parallel compilation, large builds, and Docker layer caching benefit from cloud resources
- Network latency - IDE responsiveness depends on internet connection quality
- File operations - Large file transfers to/from workspace can be slower than local disk
Most developers report equal or better performance for CPU-intensive tasks (builds, tests, Docker), with acceptable latency for typing and editing on modern internet connections (50+ Mbps).
No, CDEs require an internet connection to access your workspace. This is the primary tradeoff compared to local development. However, some platforms offer workarounds:
- Hybrid mode - Tools like DevPod let you switch between cloud and local workspaces
- Local fallback - Keep DevContainer configs that work both remotely and locally
- Mobile hotspot - Many developers use phone tethering as backup connectivity
This depends on your platform's configuration:
- Auto-stop after idle time - Most platforms pause workspaces after 30 minutes to 2 hours of inactivity to save costs
- Persistent storage - Your code and changes are saved; reconnecting resumes where you left off
- Manual stop/start - You can manually stop workspaces to avoid idle charges, restarting when needed
It depends on your organization's policy:
- Open workspaces - You have sudo/admin access and can install anything (like your own laptop)
- Locked templates - Platform teams pre-install approved tools; you cannot modify the base image
- User-layer installs - You can install tools in your home directory without system-wide changes
Most organizations use locked templates for consistency and security, with a process to request new tools be added to the standard image.
Nix is a purely functional package manager that creates reproducible, declarative development environments. In 2026, several CDE platforms have adopted Nix as an alternative (or complement) to Docker-based DevContainers:
- Deterministic builds - Nix guarantees the exact same packages and versions every time, eliminating "it worked yesterday" issues
- No Docker overhead - Nix environments run natively without container layers, reducing startup time and resource usage
- Composable stacks - Developers can layer multiple language toolchains (Node.js + Python + Rust) without Dockerfile complexity
- Platform support - Ona uses Nix natively via devfile.yaml, and Coder supports Nix through custom Terraform templates. Devbox (by Jetify) is another popular Nix-based tool for CDE configuration.
The protocol connecting your local IDE to the remote workspace directly impacts latency, responsiveness, and feature support:
- SSH - The universal standard. VS Code Remote SSH and terminal editors use it. Reliable but basic - no built-in port forwarding UI or workspace awareness.
- JetBrains Gateway (RD Protocol) - A proprietary protocol optimized for IntelliJ-family IDEs. Handles rich IDE features like debugging, refactoring, and indexing with server-side processing.
- VS Code Tunnels - Microsoft's managed tunnel service that works through firewalls without SSH configuration. Used by Codespaces and Dev Box.
- WireGuard/Tailscale - Several CDEs (including Coder) use WireGuard-based mesh networking for encrypted, low-latency peer-to-peer connections between your device and the workspace.
For best results, choose a CDE whose protocol aligns with your IDE. If your team uses mixed editors, SSH provides the broadest compatibility.
MicroVMs are lightweight virtual machines that boot in milliseconds while providing hardware-level isolation. They have become a defining technology for CDEs in 2026:
- Firecracker - Originally built by AWS for Lambda, now used by CDE platforms to spin up isolated workspaces in under 125ms with minimal memory overhead
- Cloud Hypervisor - An open-source alternative backed by Intel and Microsoft, optimized for cloud-native workloads
- Security benefit - Unlike containers that share a host kernel, microVMs give each workspace its own kernel, preventing container-escape attacks
- AI agent safety - MicroVMs are ideal for running AI-generated code because a rogue process cannot escape the VM boundary, making them the preferred isolation layer for autonomous development workflows
Security & Compliance Questions
CDEs provide several security advantages:
- Data exfiltration prevention - Source code never touches local disks, preventing theft via lost/stolen laptops
- VPC isolation - Code stays within your cloud network, never traversing public internet
- Access controls - Centralized authentication (SSO, MFA) and permission management
- Audit logging - Complete visibility into who accessed what code and when
- Instant revocation - When an employee leaves, delete their workspace immediately
- AI agent containment - Sandboxed environments prevent AI coding agents from accessing production systems, leaking secrets, or making unreviewed network calls
CDEs can help achieve compliance, but it depends on your implementation:
- Self-hosted CDEs (like Coder) run in your own AWS/Azure, so YOU control data residency and compliance
- Managed CDEs (like Codespaces, Ona) have their own SOC 2 certifications, but you must verify they meet YOUR requirements
- HITRUST CSF - Healthcare organizations often require self-hosted CDEs in HITRUST-certified infrastructure
- GDPR - Data residency (EU regions) and right-to-deletion are achievable with proper configuration
For regulated industries, self-hosted CDEs (Coder, Microsoft Dev Box) are preferred for maximum control.
Most enterprise CDEs log:
- Authentication events - Every login, logout, and failed authentication attempt
- Workspace lifecycle - Creation, start, stop, deletion timestamps
- Code access - Which repositories were cloned and when
- Resource usage - CPU, RAM, disk consumption for cost attribution
- Admin actions - Template changes, permission grants, and policy updates
Logs typically integrate with SIEM systems like Splunk, Datadog, or CloudWatch for compliance reporting.
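The exact event shape varies by platform, but a normalized record forwarded to a SIEM often looks something like this sketch (the field names are illustrative, not any vendor's schema):

```python
import json
from datetime import datetime, timezone

def audit_event(actor: str, action: str, target: str, **details) -> str:
    """Build a normalized CDE audit record as a JSON line for SIEM ingestion.

    Field names are illustrative examples, not a specific vendor's schema.
    """
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": actor,      # who: user or service account
        "action": action,    # what: e.g. workspace.start, repo.clone
        "target": target,    # on what: workspace or repository identifier
        "details": details,  # extra context for compliance queries
    }
    return json.dumps(record)

line = audit_event("alice@example.com", "repo.clone",
                   "git@github.com:acme/payments.git", workspace="ws-1234")
print(line)
```

Emitting one JSON line per event keeps ingestion simple for Splunk, Datadog, or CloudWatch Logs.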
Yes, this is a common use case. CDEs provide:
- Time-limited access - Workspaces auto-delete after contract end date
- Scoped permissions - Contractors only see repositories they need
- No code download - They can edit code but cannot clone to their personal machines
- Session recording - Some platforms can record terminal sessions for audit trails
This is why architecture matters:
- Self-hosted CDEs - Your code never leaves your AWS/Azure account. A breach of the CDE software does not expose your code.
- Managed CDEs - Your code is stored on the vendor's infrastructure. Evaluate their SOC 2 reports and incident response history.
- Mitigation - Use VPN or private networking, encrypt data at rest and in transit, and regularly rotate credentials.
High-security organizations (government, finance, healthcare) almost always choose self-hosted options like Coder or Microsoft Dev Box.
Yes, enterprise CDE platforms support:
- SSO providers - Okta, Microsoft Entra ID (formerly Azure AD), Google Workspace, Auth0
- SAML 2.0 - Standard enterprise authentication protocol
- OIDC (OpenID Connect) - Modern OAuth-based authentication
- MFA enforcement - Require TOTP, SMS, or hardware keys for all logins
Cost & ROI Questions
Pricing varies by platform and usage:
- Managed CDEs - GitHub Codespaces costs around $0.18/hour for 2 cores, $0.36/hour for 4 cores. Ona offers usage-based pricing on a similar scale. Daytona offers a managed tier alongside its open-source self-hosted option.
- Self-hosted CDEs - You pay cloud compute costs (AWS EC2, Azure VMs) plus the platform license. Coder offers a free tier for small teams and enterprise pricing starting around $20/user/month. DevPod is free and open source with no licensing cost.
- Local development - Appears free, but factor in high-spec laptop costs ($2,500-$5,000 in 2026), setup time, security risks, and the growing cost of AI agent compute on local hardware
Most organizations see net savings due to reduced onboarding time (90% faster), auto-stop cost optimization (workspaces only run when used), and avoiding expensive laptop purchases. Typical ROI payback is 6-12 months.
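A back-of-the-envelope comparison using the figures quoted above makes the tradeoff tangible. The usage assumptions here (6 active hours/day, 220 working days/year, a $3,500 laptop amortized over 3 years) are illustrative, not vendor data:

```python
# Rough yearly cost comparison using the rates quoted above.
# Assumptions (illustrative): ~6 active hours/day, ~220 working days/year,
# and auto-stop means you only pay for active hours.
HOURLY_4_CORE = 0.36                 # $/hour for a 4-core managed workspace
ACTIVE_HOURS_PER_YEAR = 6 * 220

cloud_cost = HOURLY_4_CORE * ACTIVE_HOURS_PER_YEAR
laptop_cost = 3500 / 3               # high-spec laptop amortized over 3 years

print(f"Managed workspace: ${cloud_cost:,.0f}/year")
print(f"High-spec laptop:  ${laptop_cost:,.0f}/year")
```

Under these assumptions the compute cost alone undercuts the laptop, before counting onboarding and security savings.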
Consider these often-overlooked costs:
- Platform engineering time - Building and maintaining Terraform templates requires dedicated staff
- Network bandwidth - If developers frequently transfer large files, egress costs can add up
- Storage costs - Persistent volumes for each developer (100GB-1TB) accumulate quickly
- Training and adoption - Developers need time to adapt to remote workflows
- Zombie workspaces - Workspaces that developers forget to delete keep accruing compute and storage costs
- AI agent compute - Autonomous coding agents can run workspaces for hours unattended, generating unexpected compute bills if not governed by quotas
Auto-stop (also called TTL - Time To Live) automatically shuts down idle workspaces to save compute costs. Common configurations:
- Idle detection - After 30-120 minutes of no keyboard/mouse activity, workspace pauses
- Daily shutdown - Workspaces stop at 6pm and restart when the developer logs in the next morning
- Weekend cleanup - All workspaces stop Friday evening, saving 48 hours of compute
Organizations report 50-70% cost savings with aggressive auto-stop policies versus 24/7 running workspaces.
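The 50-70% figure is easy to sanity-check: a workspace billed only during working hours consumes a small fraction of a 168-hour week. This sketch assumes an aggressive weekdays-only, 9-hours-a-day policy:

```python
# Sanity check on auto-stop savings versus a 24/7 workspace.
HOURS_PER_WEEK = 24 * 7          # 168 hours in a week

always_on = HOURS_PER_WEEK
# Assumed policy: running 9 hours/day on weekdays, auto-stopped otherwise
working_hours = 9 * 5            # 45 billable hours per week

savings = 1 - working_hours / always_on
print(f"Compute hours saved per week: {always_on - working_hours}")
print(f"Savings vs 24/7: {savings:.0%}")
```

An aggressive policy like this lands slightly above 70%, consistent with the reported range.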
ROI comes from several areas:
- Onboarding speed - Reduce setup from 3-5 days to 1 hour. For a 100-person team, this saves 300-500 developer-days per year.
- Laptop savings - Avoid high-end laptops ($2,000+). Issue lightweight Chromebooks ($300) instead.
- Reduced environment drift - Eliminate 2-4 hours per week debugging "works on my machine" issues.
- Security incident avoidance - Preventing one code leak can save millions in regulatory fines.
- AI agent productivity - CDEs enable safe, parallel AI-assisted coding workflows that multiply developer output without additional hardware investment.
Typical enterprise ROI: $500K-$2M annual savings for teams of 50-200 developers. Organizations running AI agent workflows in CDEs report additional 20-40% productivity gains.
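The onboarding arithmetic above can be reproduced in a few lines. The assumption that each developer goes through roughly one environment setup per year (turnover, new laptops, new projects) is illustrative:

```python
# Onboarding-speed ROI from the figures above.
TEAM_SIZE = 100
SETUPS_PER_DEV_PER_YEAR = 1   # assumption: turnover, laptop refreshes, projects
OLD_SETUP_DAYS = (3, 5)       # local setup, low and high estimates
NEW_SETUP_DAYS = 1 / 8        # ~1 hour with a CDE (8-hour workday)

low, high = (TEAM_SIZE * SETUPS_PER_DEV_PER_YEAR * (d - NEW_SETUP_DAYS)
             for d in OLD_SETUP_DAYS)
print(f"Developer-days saved per year: {low:.0f}-{high:.0f}")
```

Under these assumptions the result lands in the 300-500 developer-day range cited above.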
Choose managed CDEs if:
- You want zero maintenance and instant setup
- You have a small team (less than 50 developers)
- Your compliance requirements are minimal
Choose self-hosted CDEs if:
- You need HITRUST, FedRAMP, or strict data residency
- You have a large team (100+ developers) where per-user costs matter
- You want full control over infrastructure and customization
Most platforms support cost attribution through:
- Workspace tagging - Tag workspaces by team, project, or cost center for AWS/Azure billing reports
- Usage metrics - Track CPU-hours, storage GB, and network egress per team
- Quotas - Set team-level budgets and automatically stop workspaces when exceeded
- Integration with FinOps tools - Export to Cloudability, CloudHealth, or Kubecost for chargeback
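A chargeback report built from tagged usage is simple to roll up once the platform exports per-workspace records. The record fields below are illustrative examples, not any specific vendor's billing schema:

```python
from collections import defaultdict

# Hypothetical usage records as exported from a CDE platform's billing API;
# field names are illustrative, not a specific vendor's schema.
usage = [
    {"workspace": "ws-1", "team": "payments", "cpu_hours": 40, "rate": 0.36},
    {"workspace": "ws-2", "team": "payments", "cpu_hours": 12, "rate": 0.18},
    {"workspace": "ws-3", "team": "search",   "cpu_hours": 55, "rate": 0.36},
]

def chargeback(records):
    """Roll up workspace spend by team tag for a simple chargeback report."""
    totals = defaultdict(float)
    for r in records:
        totals[r["team"]] += r["cpu_hours"] * r["rate"]
    return dict(totals)

for team, cost in sorted(chargeback(usage).items()):
    print(f"{team}: ${cost:.2f}")
```

The same roll-up generalizes to any tag dimension (project, cost center) before export to a FinOps tool.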
Implementation Questions
Timeline varies by platform and complexity:
- Managed CDEs (Codespaces, Ona) - 1-2 weeks for basic setup, 1-2 months for enterprise configuration
- Self-hosted CDEs (Coder) - 2-4 weeks for pilot, 2-3 months for full production deployment
- Template development - 1-2 weeks per major stack (Node.js, Python, Java, .NET)
- SSO/compliance integration - Add 2-4 weeks for SAML, audit logging, and VPN setup
Typical staffing for self-hosted CDEs:
- Small team (less than 50 devs) - 1 platform engineer (part-time, 20-40% capacity)
- Medium team (50-200 devs) - 1-2 platform engineers (full-time)
- Large team (200+ devs) - 2-4 platform engineers plus 1 SRE for production support
Responsibilities include: maintaining Terraform templates, onboarding new teams, troubleshooting workspace issues, and monitoring cost/performance.
Phased rollout is strongly recommended:
- Phase 1 - Pilot (2-4 weeks) - 5-10 volunteer developers from a single team
- Phase 2 - Early adopters (1-2 months) - Expand to 25-50 developers across 2-3 teams
- Phase 3 - General availability (3-6 months) - Open to all developers, optional usage
- Phase 4 - Mandatory (6-12 months) - Require CDEs for new hires, contractors, or high-security projects
This approach lets you refine templates, gather feedback, and build internal champions before wide adoption.
Top failure modes:
- Poor network connectivity - Developers on slow/unreliable internet have a terrible experience
- Incomplete templates - Missing tools force developers to hack around limitations
- Lack of training - Developers resist change when they don't understand benefits
- Cost surprises - No auto-stop policies lead to budget overruns and executive backlash
- Forcing immediate migration - Requiring all developers to switch on day one causes revolt
Resistance is natural. Effective strategies:
- Start with volunteers - Find early champions who love it and evangelize internally
- Hybrid approach - Allow CDEs for heavy workloads, local dev for quick edits
- Showcase benefits - Demo faster onboarding, more powerful builds, zero setup time
- Mandatory for new hires only - Let existing devs continue locally, but new employees start with CDEs
- Compliance mandate - If security/audit requires it, it becomes non-negotiable
Yes, migration is straightforward:
- Git-based workflows - Push local commits, then clone into CDE workspace. No data loss.
- DevContainer configs - If you already use DevContainers, they work identically in CDEs
- Dotfiles sync - CDEs support dotfile repositories to preserve shell aliases, vim configs, etc.
- IDE settings - VS Code settings sync and JetBrains Settings Repository carry over automatically
Tool-Specific Questions
GitHub Codespaces:
- Managed SaaS by GitHub, runs on their infrastructure
- Container-based only (no VMs)
- Deeply integrated with GitHub repositories
- Zero setup, instant start
Coder:
- Self-hosted in YOUR AWS/Azure/GCP account
- Supports containers, VMs, and bare metal via Terraform
- Works with any Git provider (GitHub, GitLab, Bitbucket, etc.)
- Requires platform team to manage infrastructure
Ona's story has changed over time (the company was formerly known as Gitpod):
- Previously - Ona (as Gitpod) was open source with a self-hosted option
- Currently - Ona focuses on their managed SaaS product. Self-hosted support was discontinued in 2023.
- Alternative - For a self-hosted Ona-like experience, consider Coder or Daytona (Daytona is fully open source)
Microsoft Dev Box is Windows-first but supports Linux:
- Windows 11 VMs - Native support, optimized for .NET, Visual Studio, and desktop apps
- WSL2 - Run Linux distributions inside Windows VMs via Windows Subsystem for Linux
- Linux VMs - Limited preview support for Ubuntu and other distributions (check Azure docs for current status)
For pure Linux workflows, Coder or Ona are better choices.
Daytona has matured significantly and is a leading open-source CDE platform in 2026:
- Fully open source - Apache 2.0 license with a growing community and enterprise backing
- DevContainer native - Built around the DevContainer spec with excellent compatibility
- Self-hosted and managed - Deploy on any cloud or on-premises infrastructure, or use Daytona's managed offering
- AI agent ready - Daytona has positioned itself as infrastructure for AI coding agents, with SDK support for programmatic workspace creation and management
Best for: Teams wanting open-source self-hosting, AI agent sandbox infrastructure, or an alternative to Coder's complexity and Ona's SaaS constraints.
Yes, many organizations run hybrid CDE strategies:
- Codespaces for open source - Public repos and quick experiments
- Coder for production - Proprietary code requiring strict compliance
- Dev Box for Windows teams - .NET developers needing Visual Studio
- Ona for frontend - React/Vue teams preferring browser-based IDEs
The key is standardizing on DevContainer configs so workspaces are portable across platforms.
Google Cloud Workstations is GCP's managed CDE offering:
- Managed by Google - Runs on GCP infrastructure, fully integrated with Cloud Console
- Container-based - Uses Cloud Shell-like environments with pre-installed gcloud CLI, kubectl, etc.
- IDE support - VS Code, JetBrains Gateway, or Cloud Code for VS Code
Best for: Teams already on GCP who want minimal setup and native integration with BigQuery, Kubernetes Engine, and other Google services.
AI Agents & Autonomous Development
AI coding agents are autonomous software tools powered by large language models (LLMs) that can write, test, debug, and refactor code with minimal human supervision. Examples in 2026 include Claude Code, Devin, GitHub Copilot Workspace, Cursor Agent, and AWS Kiro Developer Agent. They need CDEs because:
- Isolation - Agents execute arbitrary code. Running them on a developer's laptop or in production is dangerous. CDEs provide throwaway sandboxes.
- Reproducibility - Agents need consistent environments with the right toolchains pre-installed to reliably build and test code.
- Parallelism - You can spin up 10 or 50 CDE workspaces simultaneously, letting agents work on multiple tasks or branches in parallel.
- Cost control - CDE auto-stop policies prevent runaway agents from burning compute budget indefinitely.
There is a spectrum of AI involvement in software development:
- AI-assisted (copilot mode) - A human writes code with AI autocomplete suggestions. The developer stays in control. Examples: GitHub Copilot, Codeium, Tabnine.
- AI-directed (agent mode) - A human describes a task, and an AI agent writes, tests, and iterates on the code autonomously. The human reviews the output. Examples: Claude Code, Devin, Copilot Workspace.
- Fully autonomous - AI agents receive tasks from CI/CD pipelines, issue trackers, or other agents with no human in the loop. This is emerging in 2026 for well-scoped tasks like dependency updates, test generation, and bug triage.
CDEs become increasingly critical as you move toward autonomous development because the AI needs a safe, ephemeral environment to operate in - not a developer's laptop or a shared staging server.
LLMOps (Large Language Model Operations) is the practice of deploying, monitoring, and governing LLM-powered tools in production workflows. For CDE teams, LLMOps concerns include:
- Prompt and context governance - Controlling what code and secrets AI agents can access within a workspace. CDEs with scoped permissions prevent agents from reading repositories they should not see.
- Token and cost tracking - Monitoring LLM API spend per workspace, per team, and per project. CDE tagging and cost attribution integrate with LLMOps dashboards.
- Audit trails - Logging what an AI agent did inside a CDE workspace (commands run, files modified, tests executed) for compliance and debugging.
- Model versioning - Pinning agent behavior to specific model versions so workspace outcomes are reproducible across runs.
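Token and cost tracking per workspace reduces to a small aggregation once usage events are collected. The prices and token counts below are made-up examples, not real model pricing:

```python
# Sketch of per-workspace LLM spend tracking (the LLMOps concern above).
# Prices and token counts are made-up examples, not real model pricing.
PRICE_PER_1K = {"input": 0.003, "output": 0.015}  # $ per 1K tokens, assumed

def llm_spend(events):
    """Sum LLM API cost per workspace from (workspace, input_toks, output_toks)."""
    spend = {}
    for ws, inp, out in events:
        cost = (inp / 1000 * PRICE_PER_1K["input"]
                + out / 1000 * PRICE_PER_1K["output"])
        spend[ws] = spend.get(ws, 0.0) + cost
    return spend

events = [("ws-agent-1", 120_000, 30_000), ("ws-agent-1", 80_000, 20_000),
          ("ws-dev-7", 5_000, 1_000)]
print(llm_spend(events))
```

Joining this with workspace tags from the CDE's cost attribution gives per-team LLM spend alongside compute spend.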
AI agent integration has become a key differentiator for CDE platforms:
- Daytona - Purpose-built SDK for AI agents. Provides programmatic workspace lifecycle management, making it a popular choice for teams building custom agent pipelines.
- GitHub Codespaces - Deep integration with Copilot Workspace. Agents can create, modify, and test code in Codespaces directly from GitHub issues and pull requests.
- Coder - API-first architecture with Terraform templates lets teams define agent-specific workspace configurations. Strong headless workspace support for CI/CD-triggered agent runs.
- Ona - Offers ephemeral workspace APIs and microVM isolation that align well with agent workloads requiring fast spin-up and strong security boundaries.
Securing AI agent access within CDEs requires multiple layers:
- Scoped credentials - Inject only the minimum secrets needed for the task. Use short-lived tokens that expire when the workspace stops.
- Network policies - Restrict outbound network access from agent workspaces. Block internet access or allow only approved registries and API endpoints.
- Read-only source mounts - Let agents read code but require changes to go through pull requests, preventing direct commits to protected branches.
- Workspace-level isolation - Use microVM-backed workspaces so a compromised agent process cannot escape to other workspaces or the host system.
- Human-in-the-loop gates - Require human approval before agent-generated code is merged, deployed, or promoted beyond the sandbox.
Yes, and this is one of the most powerful patterns in 2026. Modern CDE platforms expose APIs that let AI agents programmatically:
- Create workspaces - Spin up a fresh environment for a specific task, branch, or pull request
- Execute commands - Install dependencies, run tests, execute builds, and analyze results
- Read and write files - Modify source code, create new files, and commit changes
- Destroy workspaces - Clean up after the task completes, leaving no lingering compute costs
This enables patterns like: an AI agent receives a GitHub issue, creates a CDE workspace, writes a fix, runs tests, opens a pull request, and tears down the workspace - all without human intervention. Daytona's SDK and Coder's API are the most mature options for this workflow.
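The loop just described can be sketched against a hypothetical client. The `CDEClient` class and its methods below are stand-ins for a real platform SDK (such as Daytona's SDK or Coder's API), not actual library calls:

```python
# End-to-end sketch of the agent loop above, against a hypothetical CDE
# client. CDEClient is a stand-in, not a real SDK.

class CDEClient:
    """Minimal in-memory stand-in for a CDE platform API."""

    def create_workspace(self, branch: str) -> str:
        return f"ws-{branch}"

    def run(self, ws: str, cmd: str) -> int:
        print(f"[{ws}] $ {cmd}")
        return 0  # pretend the command succeeded

    def delete_workspace(self, ws: str) -> None:
        print(f"[{ws}] deleted")

def fix_issue(client: CDEClient, branch: str) -> bool:
    """Create a sandbox, build and test, open a PR, always tear down."""
    ws = client.create_workspace(branch)
    try:
        ok = (client.run(ws, "pip install -r requirements.txt") == 0
              and client.run(ws, "pytest") == 0)
        if ok:
            client.run(ws, "gh pr create --fill")  # PR for human review
        return ok
    finally:
        client.delete_workspace(ws)  # no lingering compute costs

fix_issue(CDEClient(), "issue-1234")
```

The try/finally teardown is the important part: whether the agent succeeds or crashes, the workspace (and its bill) goes away.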
In 2026, the AI-augmented SDLC uses CDEs at multiple stages:
- Planning - AI agents analyze codebases in CDE workspaces to estimate effort, identify dependencies, and suggest implementation approaches
- Development - Human developers and AI agents work in parallel CDE workspaces, with agents handling routine tasks (boilerplate, tests, migrations) while humans focus on architecture and business logic
- Code review - AI reviewers spin up CDE workspaces to actually build and test pull requests, going beyond static analysis to catch runtime issues
- Maintenance - Autonomous agents handle dependency updates, security patches, and technical debt reduction in isolated CDE workspaces, submitting PRs for human approval
Still Have Questions?
Explore our in-depth guides or reach out to our team for personalized advice on implementing cloud development environments.
