Python 3.11, 3.12, and 3.13 on Windows x64. Linux and macOS targets are on the roadmap but not in the current beta.
The following Python features are not yet supported inside the VM: async/await, generators (yield), and context managers (with statements). Functions using these can be excluded from virtualization with the @vm_skip decorator and will run normally via CPython.
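A minimal usage sketch of the exclusion pattern. The decorator name @vm_skip comes from the text above; the import location, the marker attribute, and the no-op stand-in used here so the sketch runs outside the build toolchain are all assumptions:

```python
# `vm_skip` is the decorator named above; outside the build toolchain we
# stand in a no-op with an assumed marker attribute so the sketch runs.
def vm_skip(func):
    func.__vm_skip__ = True  # hypothetical marker; the real tool injects its own metadata
    return func

@vm_skip  # generators are not virtualizable yet, so this runs via plain CPython
def read_chunks(data, size=4):
    for i in range(0, len(data), size):
        yield data[i:i + size]
```

The decorated function stays ordinary CPython; only the functions left undecorated go through the VM.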
Yes. Your Python logic is virtualized and protected. Native C extensions such as NumPy, Pandas, Pillow, and SciPy are included in the build as-is and run at full native speed. They are not obfuscated, because they were never your source code to begin with.
The practical rule: if your proprietary logic is in Python, it is protected. If it delegates heavy math to NumPy, that call still runs at C speed.
No, not by default. The anti-VM / anti-sandbox detection is an optional hardening step that you enable per build. The standard release profile leaves it off, so your app runs normally inside cloud VMs, CI runners, and developer VMs.
Turn it on only when you distribute to end users and want to make analyst sandboxing harder.
No. PyVMProtect targets the ABI of a minor version, not a patch version. A .pyd built for cp311 works on every 3.11.x interpreter. You only need a new build when you move to a new minor line, for example 3.11 to 3.12.
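A quick way to see which minor-version ABI tag the running interpreter corresponds to, using only the standard library (the example suffix in the comment is illustrative):

```python
import sys
import sysconfig

# The build targets a minor-version ABI tag: cp311 covers every 3.11.x.
abi_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"

# The extension suffix of the current interpreter embeds that tag,
# e.g. ".cp311-win_amd64.pyd" on Windows.
ext_suffix = sysconfig.get_config_var("EXT_SUFFIX")
print(abi_tag, ext_suffix)
```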
For typical application code (business logic, JSON, Windows API calls) compiled modules run at 1.1–1.8× the CPython time. Pure compute loops (tight math, matrix ops) are much heavier through the VM, but they should not be virtualized in the first place. Exclude them via configuration and they run at native speed with ~0% overhead.
Cold start (import + first call) is 5–20 ms, amortized over the process lifetime.
C extensions (NumPy, SciPy, Pandas) are never virtualized and always run at native speed.
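A generic way to find the tight compute loops worth excluding is the standard-library profiler; this is ordinary Python profiling advice, not a PyVMProtect feature:

```python
import cProfile
import io
import pstats

def hot_loop(n):
    # stand-in for a tight pure-Python compute loop
    total = 0
    for i in range(n):
        total += i * i
    return total

pr = cProfile.Profile()
pr.enable()
hot_loop(200_000)
pr.disable()

out = io.StringIO()
pstats.Stats(pr, stream=out).sort_stats("cumulative").print_stats(5)
report = out.getvalue()  # functions dominating here are exclusion candidates
```

Functions that dominate cumulative time and do pure arithmetic are the ones to leave out of virtualization.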
Measured on real builds: a compact module (~80 LOC) compiles to about 268 KB. That baseline is mostly the fused VM interpreter and security machinery, not your code. A 1,600-line module reaches roughly 1.5 MB. Heavier mutation profiles will push this further.
Full numbers with the benchmark methodology are on the Technology page.
During the current beta, built .pyd files are not signed by default, so Windows SmartScreen may warn on first run. Automatic signing by "PyVMProtect Systems" is planned before general availability. Enterprise customers will be able to sign with their own EV certificate.
Yes. Uncaught exceptions produce a normal Python traceback with line numbers and the original exception type and message. In the hardened release profile, variable names inside the traceback are obfuscated, but the structure stays readable.
A dev profile is available when you want completely clean traces for debugging.
It is held only long enough to run the build and let you download the protected output, then deleted from disk. It is not archived, not backed up, and not used for training. Full details are in the privacy policy.
A self-hosted compiler is available on request for enterprise customers that cannot send source code to a third party. It runs as a container inside your own infrastructure. Contact us if you need this.
A determined, skilled reverse engineer with unlimited time can always make progress on any compiled binary. PyVMProtect does not claim to be unbreakable; no software protection is. What it does is raise the cost of analysis to the point where attacking your code is no longer economically rational.
A recent hands-on analysis of a protected crackme found that the analyst could not recover the protected logic. They resorted to intercepting the program's output, which means the internal algorithm was never exposed. For most protection goals (proprietary algorithms, licensing, SaaS anti-repackaging), that is the correct outcome.
Every build gets a unique opcode remapping, encrypted bytecode blob, randomized dispatch table, and epoch-based instruction key rotation. An analyst reversing one binary gains nothing on the next.
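The per-build opcode remapping idea can be illustrated with a seeded shuffle. This is a deliberately simplified sketch of the principle, not the actual scheme:

```python
import random

def remap_opcodes(seed, n_opcodes=256):
    # A seeded permutation: each build seed yields its own opcode numbering,
    # so a mapping learned from one binary says nothing about the next.
    rng = random.Random(seed)
    mapping = list(range(n_opcodes))
    rng.shuffle(mapping)
    return mapping  # mapping[original_opcode] -> remapped opcode

build_a = remap_opcodes(seed=0x1F2E3D4C)
build_b = remap_opcodes(seed=0x5A6B7C8D)
```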
Can an attacker monkey-patch built-ins (for example print) to intercept decrypted data?
No, not since the 1.3 runtime. The VM now checks every name resolved from the builtins namespace at load time. If a built-in function has been replaced with a Python function (the classic builtins.print = hook monkey-patch), the VM detects it and raises a security violation before the call is ever made.
This covers print, input, eval, exec, open, and every other entry in the builtins namespace. Native C built-ins always remain PyCFunction objects; a Python function in that slot is by definition a replacement.
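The detection principle can be sketched in pure Python. This is illustrative only; the real check runs inside the VM at load time:

```python
import builtins
import types

def audit_builtins():
    # Native built-ins are PyCFunction objects (types.BuiltinFunctionType);
    # a pure-Python function sitting in a builtins slot is by definition
    # a replacement.
    return [
        name for name in dir(builtins)
        if isinstance(getattr(builtins, name), types.FunctionType)
    ]
```

On a clean interpreter the audit returns an empty list; after builtins.print = hook it flags "print".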
The VM raises a RuntimeError and terminates execution. On Windows, a modal error dialog appears alongside the terminal message so the event is visible regardless of how the binary was launched. The decryption seed is corrupted and the in-memory bytecode is wiped before exit, so a memory dump taken after detection is not useful.
The checks run across multiple layers: PEB flags, hardware debug registers, kernel debug port, RDTSC timing, API hook detection, and a watchdog thread that monitors its own liveness. An attacker who suspends the watchdog is detected within approximately three seconds.
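The watchdog idea can be sketched with a daemon thread and a heartbeat. This is a simplified illustration of the principle, not the runtime's implementation, and the timings are arbitrary:

```python
import threading
import time

class Watchdog:
    """Sketch: a daemon thread keeps a heartbeat fresh; the protected
    execution path checks that heartbeat and treats a stale one (e.g. the
    thread was suspended by an attacker) as tampering."""

    def __init__(self, max_age=3.0):
        self.max_age = max_age
        self._last_beat = time.monotonic()
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def _run(self):
        while not self._stop.wait(0.05):
            self._last_beat = time.monotonic()  # heartbeat

    def alive(self):
        # Called from the protected execution path on every check-in.
        return time.monotonic() - self._last_beat <= self.max_age

    def stop(self):
        self._stop.set()
        self._thread.join()
```

If the thread stops beating for longer than max_age, the next check-in from the protected path sees a stale heartbeat, which is the roughly-three-second detection window the text describes.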
Obtaining a trusted EV (Extended Validation) code-signing certificate requires a formal identity verification process that takes several weeks. We are in that process now. Until it completes, Windows SmartScreen may display a warning on first run for binaries downloaded from the internet.
This does not affect the security of the protected code itself: the .pyd has its own internal integrity check that runs independently of the OS signature chain. Signing is an OS-level trust signal for distribution, not a component of the VM's tamper resistance.
Enterprise customers who cannot wait can provide their own EV certificate and we will apply it at build time.
The VM runtime is closed-source. However, even if an analyst had the full C++ source, it would not meaningfully help them reverse a protected binary. The opcode table, dispatch cookie, epoch boundaries, bytecode encryption keys, and CFI handler layout are all generated fresh per build from a 128-bit random seed. There is no static mapping to study.
This is the same principle as AES: the algorithm being public does not compromise the key.
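The per-build derivation described above can be sketched as a hash-based expansion of a 128-bit seed. The KDF construction and the parameter names here are assumptions for illustration, not the actual key schedule:

```python
import hashlib
import secrets

def derive_build_params(seed):
    # Expand one 128-bit seed into independent per-build secrets using a
    # SHAKE-based expansion (illustrative KDF; names are hypothetical).
    def kdf(label, length):
        return hashlib.shake_256(label.encode() + b"|" + seed).digest(length)
    return {
        "opcode_table_seed": kdf("opcode-table", 16),
        "bytecode_key": kdf("bytecode-key", 32),
        "dispatch_cookie": kdf("dispatch-cookie", 8),
        "epoch_key": kdf("epoch-key", 16),
    }

params = derive_build_params(secrets.token_bytes(16))  # fresh per build
```

Knowing the derivation function (as with knowing AES) gives an analyst nothing without the seed, and every build draws a new one.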