Native installation with PowerShell — No Linux, no WSL2
# Windows 21H2+ and NVIDIA drivers
winver
nvidia-smi
# If it fails: nvidia.com/Download → RTX 4060 → Win 11
Task Manager → Performance → CPU: "Virtualization: Enabled". If not, enable it in the BIOS (SVM / VT-x).
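As an alternative to opening Task Manager, you can query the firmware virtualization flag directly from PowerShell (a quick check; `VirtualizationFirmwareEnabled` reads the same state the BIOS setting controls):

```powershell
# True means virtualization is enabled in firmware
Get-CimInstance Win32_Processor | Select-Object Name, VirtualizationFirmwareEnabled
```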
winget install --id Microsoft.PowerShell --source winget
# Restart PowerShell 7 and use it for all steps
$PSVersionTable.PSVersion
winget upgrade --all
winget install --id Git.Git -e --source winget
winget install --id Microsoft.WindowsTerminal -e
winget install --id OpenJS.NodeJS.LTS -e
# Restart PowerShell 7
node --version   # v24.x.x
npm --version    # 10.x.x
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
winget install --id Docker.DockerDesktop -e
Docker Desktop → Settings → General: Use WSL 2 | Resources → Advanced: 8 GB RAM
docker --version
docker run hello-world
winget install --id Ollama.Ollama -e
Invoke-RestMethod -Uri "http://localhost:11434/api/tags"
nvidia-smi
# Recommended variables: OLLAMA_GPU_OVERHEAD=512MB, OLLAMA_MAX_VRAM=7680MB
# As Administrator
[System.Environment]::SetEnvironmentVariable("OLLAMA_GPU_OVERHEAD", "512MB", "Machine")
[System.Environment]::SetEnvironmentVariable("OLLAMA_MAX_VRAM", "7680MB", "Machine")
Restart-Service -Name "ollama" -ErrorAction SilentlyContinue
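Machine-scoped variables only appear in sessions opened after they are set. To confirm they were written without restarting the shell, you can read them back from the registry-backed Machine scope:

```powershell
# Read the values back at Machine scope (bypasses the current session's cache)
[System.Environment]::GetEnvironmentVariable("OLLAMA_GPU_OVERHEAD", "Machine")
[System.Environment]::GetEnvironmentVariable("OLLAMA_MAX_VRAM", "Machine")
```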
npm install -g openclaw@latest
openclaw --version   # >= 2026.1.29
Earlier versions have a critical vulnerability (CVSS 8.8). Update with npm install -g openclaw@latest.
openclaw onboard
Wizard: Provider Ollama → URL http://127.0.0.1:11434 → mode Cloud + Local → daemon Yes → auth token → DM pairing.
# VRAM: 8B ~5.5GB | 7B ~4.5GB | 14B ~8+2GB | 4B ~2.5GB
ollama pull qwen3:8b-q4_K_M
ollama pull qwen2.5-coder:7b-q4_K_M
ollama pull qwen2.5-coder:14b-q4_K_M
ollama pull qwen3:4b-q4_K_M
ollama list
ollama run qwen3:8b-q4_K_M "tool calling test OK"
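Before pulling the larger models, it can help to check how much VRAM is actually free against the estimates above. `nvidia-smi` supports a query mode for exactly this:

```powershell
# Report total, used and free GPU memory in CSV form
nvidia-smi --query-gpu=memory.total,memory.used,memory.free --format=csv
```

If `memory.free` is well below a model's estimate, close GPU-heavy applications (browsers, games) before running it.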
# 1. aistudio.google.com → Get API Key → Create in new project
# 2. Run as Admin ("User" = your user only):
[System.Environment]::SetEnvironmentVariable("GEMINI_API_KEY", "AIza-YOUR-KEY-HERE", "User")
# Reopen PowerShell before verifying
$env:GEMINI_API_KEY
notepad "$env:USERPROFILE\.openclaw\openclaw.json"
# or: code "$env:USERPROFILE\.openclaw\openclaw.json"
{
"gateway": {
"bind": "loopback",
"port": 5189,
"auth": { "mode": "token" },
"mDNS": { "mode": "minimal" }
},
"messages": {
"dmPolicy": "pairing",
"groupPolicy": "mention"
},
"tools": {
"allow": ["read", "write", "edit", "web_search", "fetch"],
"deny": ["exec", "browser", "camera", "screen"],
"execApproval": true
},
"agents": {
"defaults": {
"model": {
"primary": "ollama/qwen3:8b-q4_K_M",
"fallbacks": [
"ollama/qwen2.5-coder:7b-q4_K_M",
"google/gemini-2.0-flash"
]
},
"modelOptions": {
"temperature": 0.1,
"contextWindow": 32768
},
"maxConcurrent": 2,
"workspace": "C:\\Users\\TuUsuario\\.openclaw\\workspace",
"compaction": { "mode": "safeguard" }
}
},
"models": {
"providers": {
"ollama": {
"baseUrl": "http://127.0.0.1:11434",
"apiKey": "ollama-local",
"models": [
{ "id": "qwen3:8b-q4_K_M", "name": "Qwen3 8B", "reasoning": true, "contextWindow": 32768, "cost": { "input": 0, "output": 0 } },
{ "id": "qwen2.5-coder:7b-q4_K_M", "name": "Qwen Coder 7B", "reasoning": false, "contextWindow": 32768, "cost": { "input": 0, "output": 0 } },
{ "id": "qwen2.5-coder:14b-q4_K_M", "name": "Qwen Coder 14B", "reasoning": false, "contextWindow": 32768, "cost": { "input": 0, "output": 0 } },
{ "id": "qwen3:4b-q4_K_M", "name": "Qwen3 4B Fast", "reasoning": false, "contextWindow": 32768, "cost": { "input": 0, "output": 0 } }
]
},
"google": {
"apiKey": "${GEMINI_API_KEY}",
"models": [
{ "id": "gemini-2.0-flash", "name": "Gemini 2.0 Flash (Free)", "contextWindow": 1000000 }
]
}
}
}
}
Replace TuUsuario in "workspace" with $env:USERNAME.
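A malformed config (a stray quote, a single backslash in a path) will break the gateway on restart. You can confirm the file still parses as valid JSON from PowerShell:

```powershell
# Throws a descriptive error if openclaw.json is not valid JSON
Get-Content "$env:USERPROFILE\.openclaw\openclaw.json" -Raw | ConvertFrom-Json
```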
openclaw gateway restart
openclaw status
# NTFS
$path = "$env:USERPROFILE\.openclaw"
$acl = Get-Acl $path
$acl.SetAccessRuleProtection($true, $false)
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule(
    $env:USERNAME, "FullControl", "ContainerInherit,ObjectInherit", "None", "Allow"
)
$acl.AddAccessRule($rule)
Set-Acl $path $acl

# Firewall
New-NetFirewallRule -DisplayName "OpenClaw Block External" `
    -Direction Inbound -Protocol TCP -LocalPort 5189 -Action Block -RemoteAddress "0.0.0.0/0"
New-NetFirewallRule -DisplayName "OpenClaw Allow Loopback" `
    -Direction Inbound -Protocol TCP -LocalPort 5189 -Action Allow -RemoteAddress "127.0.0.1"

# Audit
openclaw security audit
openclaw security audit --deep
openclaw security audit --fix
[System.Environment]::SetEnvironmentVariable("OPENCLAW_DISABLE_BONJOUR", "1", "User")
$PSVersionTable.PSVersion
node --version
docker --version
nvidia-smi
ollama list
openclaw --version
openclaw status
$env:GEMINI_API_KEY
openclaw security audit
openclaw chat --model "ollama/qwen3:8b-q4_K_M" "Say hello"
npm scripts disabled: Set-ExecutionPolicy RemoteSigned -Scope CurrentUser
Ollama doesn't detect the GPU: update the NVIDIA drivers and reboot
openclaw not recognized: restart PowerShell 7 or run $env:PATH += ";$env:APPDATA\npm"
Variables don't load: restart PowerShell after setting them
Docker won't start: enable virtualization in the BIOS and run wsl --update as Admin
Port 5189 in use: netstat -aon | findstr "5189" and change the port in openclaw.json
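As an alternative to parsing netstat output, PowerShell can map the port to the owning process directly, which tells you what to stop (or whether it is a stale openclaw instance):

```powershell
# Find the listener on port 5189 and show the process that owns it
$conn = Get-NetTCPConnection -LocalPort 5189 -State Listen -ErrorAction SilentlyContinue
if ($conn) { Get-Process -Id $conn.OwningProcess }
```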
Path with spaces: use quotes, and double backslashes inside openclaw.json: "C:\\Users\\My User\\.openclaw"