VeloServe: A High-Performance Web Server written in Rust
TL;DR: I spent a day building VeloServe, a high-performance web server with embedded PHP support, using Cursor Pro and Claude Opus 4.5. It's now open source and you can try it in one command.
Visit: https://www.veloserve.io for more details.
The Beginning of a Wild Idea
I've been frustrated with the traditional web server stack for years. Nginx is great, Apache is reliable, but setting up PHP-FPM, configuring sockets, tuning workers... it's a lot. What if there were a single binary that just worked, and that you could run the same way on any cloud provider, in a data center, or on your local dev machine?
That's when I decided to build VeloServe — a modern web server written in Rust with PHP embedded directly into it. No PHP-FPM. No separate processes. Just speed.
The Setup: Cursor Pro + Claude Opus 4.5 + Ona.com
Here's my development setup that made this possible:
Cursor Pro with Remote SSH
I use Cursor as my primary IDE. It's VS Code on steroids with AI built-in. The game-changer? Remote SSH connections. I connected Cursor to an Ona.com workspace (formerly Gitpod), which gave me a fully configured development environment in the cloud.
Why Ona.com? It spins up a complete Linux environment with Docker, all the tools I need, and most importantly — I can close my laptop and pick up exactly where I left off from any device.
Claude Opus 4.5 as My Pair Programmer
The real magic happened with Claude Opus 4.5 through Cursor's AI features. But here's the thing — I didn't just blindly accept AI suggestions.
Every piece of Rust code, I verified against the official Rust documentation. Every Tokio async pattern, I cross-referenced with the Tokio docs. Every bit of Hyper HTTP handling, I checked against Hyper's examples.
This is what I call Vibe Coding — you work with the AI, not for it. The AI suggests, you verify, you refine, you ship.
VeloServe is a web server that:
- Runs PHP inside itself — no external PHP-FPM process
- Written in Rust — memory safe, blazing fast
- Supports two modes:
  - CGI Mode — uses php-cgi, works everywhere
  - SAPI Mode — PHP embedded via FFI, 10-100x faster
- WordPress/Magento ready — intelligent caching, clean URLs
- Single binary — just download and run
The Numbers
Performance comparison:
- 🚀 VeloServe (SAPI Mode): ~10,000 requests/sec, ~1ms latency, 5x faster than traditional setups
- ⚡ Nginx + PHP-FPM (traditional): ~2,000 requests/sec, ~10ms latency, the industry-standard baseline
- 🐌 VeloServe (CGI Mode): ~500 requests/sec, ~50ms latency, compatibility mode that works everywhere
The Development Journey
Day 1: Core HTTP Server
We started with the basics — a Tokio-based async HTTP server using Hyper. The initial commit was just serving static files with proper MIME types, ETag headers, and conditional requests.
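For context, the core pattern looks roughly like this. It's a minimal sketch following the published hyper 1.x / hyper-util examples, not VeloServe's actual code; the MIME table is deliberately tiny, and ETag/conditional-request handling is left out:

```rust
// Minimal Tokio + Hyper 1.x static file responder (sketch, not VeloServe's code).
// Assumes crates: tokio (full), hyper (http1, server), hyper-util, http-body-util.
use std::convert::Infallible;
use std::net::SocketAddr;

use http_body_util::Full;
use hyper::body::Bytes;
use hyper::server::conn::http1;
use hyper::service::service_fn;
use hyper::{Request, Response, StatusCode};
use hyper_util::rt::TokioIo;
use tokio::net::TcpListener;

// Tiny extension-to-MIME map; a real server covers far more types.
fn mime_for(path: &str) -> &'static str {
    match path.rsplit('.').next() {
        Some("html") => "text/html; charset=utf-8",
        Some("css") => "text/css",
        Some("js") => "application/javascript",
        Some("png") => "image/png",
        _ => "application/octet-stream",
    }
}

async fn serve_file(
    req: Request<hyper::body::Incoming>,
) -> Result<Response<Full<Bytes>>, Infallible> {
    // Map "/" to index.html; no path-traversal protection in this sketch.
    let path = match req.uri().path() {
        "/" => "index.html".to_string(),
        p => p.trim_start_matches('/').to_string(),
    };
    let resp = match tokio::fs::read(&path).await {
        Ok(bytes) => Response::builder()
            .header("content-type", mime_for(&path))
            .body(Full::new(Bytes::from(bytes)))
            .unwrap(),
        Err(_) => Response::builder()
            .status(StatusCode::NOT_FOUND)
            .body(Full::new(Bytes::from("404 Not Found")))
            .unwrap(),
    };
    Ok(resp)
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    let listener = TcpListener::bind(SocketAddr::from(([127, 0, 0, 1], 8080))).await?;
    loop {
        let (stream, _) = listener.accept().await?;
        let io = TokioIo::new(stream);
        // One lightweight Tokio task per connection.
        tokio::spawn(async move {
            if let Err(err) = http1::Builder::new()
                .serve_connection(io, service_fn(serve_file))
                .await
            {
                eprintln!("error serving connection: {err:?}");
            }
        });
    }
}
```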
I kept the Hyper migration guide open in another tab the entire time. Hyper 1.0 changed a lot, and Claude's training data didn't always have the latest patterns. Always verify.
Day 2: PHP Integration
This is where it got interesting. We implemented PHP execution in two ways:
- CGI Mode — spawn php-cgi for each request, pass CGI environment variables, and pipe POST data through stdin (see the sketch below)
- SAPI Mode — use Rust FFI to link against libphp.so and execute PHP in-process
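Here's roughly what the CGI path looks like with Tokio's process API. It's a hedged sketch under my own naming (run_php_cgi is not a real VeloServe function), and the environment set is trimmed down to the essentials:

```rust
// Rough sketch of the CGI path: spawn php-cgi per request, pass CGI
// environment variables, and pipe the request body through stdin.
use std::process::Stdio;
use tokio::io::AsyncWriteExt;
use tokio::process::Command;

pub async fn run_php_cgi(script: &str, body: &[u8]) -> std::io::Result<Vec<u8>> {
    let mut child = Command::new("php-cgi")
        .env("GATEWAY_INTERFACE", "CGI/1.1")
        .env("REQUEST_METHOD", "POST")
        .env("SCRIPT_FILENAME", script)
        .env("CONTENT_LENGTH", body.len().to_string())
        .env("REDIRECT_STATUS", "200") // required by php-cgi's cgi.force_redirect
        .stdin(Stdio::piped())
        .stdout(Stdio::piped())
        .spawn()?;

    // Pipe the POST body into php-cgi's stdin; dropping the handle closes it
    // so PHP sees EOF.
    if let Some(mut stdin) = child.stdin.take() {
        stdin.write_all(body).await?;
    }

    // php-cgi writes CGI headers + body to stdout; the caller parses them.
    let output = child.wait_with_output().await?;
    Ok(output.stdout)
}
```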
The FFI work was tricky. We had to:
- Create a build.rs that detects the PHP installation via php-config
- Write FFI bindings for php_embed_init(), php_execute_script(), etc.
- Handle the PHP lifecycle correctly
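For the first item, a build.rs along these lines is the general idea. It assumes php-config is on the PATH and that libphp.so lives under $(php-config --prefix)/lib, which is typical for a from-source --enable-embed build but will vary by distro:

```rust
// build.rs sketch: ask php-config where PHP lives and emit linker directives.
// Assumes a PHP build with the embed SAPI (libphp.so) installed.
use std::process::Command;

fn php_config(arg: &str) -> String {
    let out = Command::new("php-config")
        .arg(arg)
        .output()
        .expect("failed to run php-config; is PHP installed?");
    String::from_utf8(out.stdout).unwrap().trim().to_string()
}

fn main() {
    // Assumption: libphp.so sits under <prefix>/lib for an --enable-embed build.
    let lib_dir = format!("{}/lib", php_config("--prefix"));
    println!("cargo:rustc-link-search=native={lib_dir}");
    println!("cargo:rustc-link-lib=dylib=php");
}
```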
I spent hours in the PHP Internals Book and the Rust FFI guide making sure we weren't going to cause memory leaks or segfaults.
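The lifecycle handling, heavily simplified, comes down to pairing php_embed_init with php_embed_shutdown. The extern declarations below are hand-written assumptions (real code should generate them with bindgen from the installed headers), and the RAII guard is just one way to make sure teardown can't be skipped:

```rust
// Sketch of the embed-SAPI lifecycle; signatures are assumptions, not generated bindings.
use std::os::raw::{c_char, c_int};
use std::ptr;

extern "C" {
    fn php_embed_init(argc: c_int, argv: *mut *mut c_char) -> c_int;
    fn php_embed_shutdown();
}

/// RAII guard: dropping it always shuts the PHP engine down, even on early returns.
struct PhpEngine;

impl PhpEngine {
    fn start() -> Option<PhpEngine> {
        // SAFETY: called once at startup. The embed examples normally forward the
        // process argc/argv; an empty argv is passed here for brevity.
        let rc = unsafe { php_embed_init(0, ptr::null_mut()) };
        // Zend's SUCCESS is 0.
        (rc == 0).then_some(PhpEngine)
    }
}

impl Drop for PhpEngine {
    fn drop(&mut self) {
        // SAFETY: init succeeded, so this is the matching teardown call.
        unsafe { php_embed_shutdown() };
    }
}
```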
Day 3: WordPress Demo & Deployment
The ultimate test — can it run WordPress? We set up:
- WordPress with SQLite (no MySQL needed for demos)
- Automatic URL detection for cloud environments
- One-click deployment on Ona.com
When I saw the WordPress installation wizard load through VeloServe with ~1ms PHP execution time, I knew we had something special.
Try It Yourself
One-Line Install
curl -sSL https://veloserve.io/install.sh | bash
Quick Test
mkdir -p /tmp/mysite
echo '<?php phpinfo();' > /tmp/mysite/index.php
veloserve start --root /tmp/mysite --listen 0.0.0.0:8080
Visit http://localhost:8080 and you'll see PHP running through VeloServe.
You can find many more useful CLI commands on the documentation pages or in the README of the GitHub repo:
https://github.com/veloserve/veloserve?tab=readme-ov-file#cli-tool
You can also spin up a ready-made WordPress demo:
https://github.com/veloserve/veloserve?tab=readme-ov-file#wordpress-demo-features
Try in the Cloud (No Install)
Don't want to install anything? Head over to veloserve.io to try it instantly in the cloud.
What I Learned
1. AI is a Force Multiplier, Not a Replacement
Claude helped me write code 10x faster, but I still needed to understand what the code was doing. When we hit a Windows build issue with Unix-only signals, I knew immediately how to fix it with #[cfg(unix)] because I understood Rust's conditional compilation.
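For anyone curious, the pattern is just a pair of cfg-gated functions. This is an illustrative sketch assuming Tokio's signal feature, not code lifted from VeloServe:

```rust
// Gate Unix-only signal handling behind cfg(unix) so the same crate still
// compiles on Windows, where it falls back to Ctrl+C.
#[cfg(unix)]
async fn shutdown_signal() {
    use tokio::signal::unix::{signal, SignalKind};
    let mut term = signal(SignalKind::terminate()).expect("install SIGTERM handler");
    tokio::select! {
        _ = term.recv() => {},
        _ = tokio::signal::ctrl_c() => {},
    }
}

#[cfg(not(unix))]
async fn shutdown_signal() {
    // Windows has no SIGTERM; Ctrl+C is the portable shutdown trigger.
    tokio::signal::ctrl_c().await.expect("install Ctrl+C handler");
}
```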
2. Always Verify Against Official Docs
AI models are trained on data that can be outdated. The Rust ecosystem moves fast. Always have the docs open:
- docs.rs for crate documentation
- The Rust Book for language features
- Official project documentation for frameworks
3. Cloud Development Environments Are Game-Changers
Working through Cursor's Remote SSH to Ona.com meant:
- Consistent environment across devices
- No "works on my machine" problems
- Easy to share and reproduce
- Powerful cloud hardware for compilation
4. Ship Early, Iterate Fast
I went from zero to a working web server with WordPress support in one day. It's not perfect — the SAPI mode needs more work, we need better error handling, and there's always more optimization to do. But it works, it's open source, and the community can help improve it.
Resources
- Website: veloserve.io
- GitHub: github.com/veloserve/veloserve
- Documentation: github.com/veloserve/veloserve/tree/main/docs
- Configuration Reference: docs/configuration.md
- Environment Variables: docs/environment-variables.md
The Stack
For those curious about the exact setup:
- IDE: Cursor Pro with Remote SSH
- AI: Claude Opus 4.5 (via Cursor)
- Cloud Environment: Ona.com (Gitpod successor)
- Language: Rust 1.75+
- Key Crates: Tokio, Hyper, tokio-rustls
- Website Hosting: Vercel
- Domain: veloserve.io
What's Next
VeloServe 1.0.0 is just the beginning. On the roadmap:
- Complete SAPI mode implementation
- FastCGI protocol support
- HTTP/3 (QUIC)
- Built-in Let's Encrypt
- Configuration hot-reload
- Prometheus metrics
Development Roadmap is here:
https://github.com/veloserve/veloserve?tab=readme-ov-file#%EF%B8%8F-development-roadmap
If you're interested in contributing or just want to try it out, head to veloserve.io and give it a spin.
And if you build something cool with it, let me know on Twitter/X or open an issue on GitHub!
https://github.com/veloserve/veloserve
Good luck, happy testing!