How to Use Spark Profiler to Diagnose Minecraft Server Lag & Performance

What Is Spark?
Spark is a performance profiling plugin/mod for Minecraft servers (Forge, Fabric, or Bukkit/Spigot-based) that helps you diagnose lag, memory issues, TPS drops, and other bottlenecks.
Key features:
- Real-time TPS / MSPT / CPU / memory monitoring
- Profiling of server threads, method calls, and memory allocations
- Heap dumps / memory summaries / GC info
- Web-based viewer that presents the profiling data in a readable format
Spark is widely used by hosting providers to assist users with performance diagnostics.
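If you want to read these same statistics from your own plugin or mod, Spark also exposes a small API. Below is a minimal sketch, assuming the `me.lucko:spark-api` artifact is on the compile classpath and Spark is installed at runtime; the class and method names follow the spark-api README, but verify them against your Spark version, and the `SparkStats` wrapper class is illustrative.

```java
import me.lucko.spark.api.Spark;
import me.lucko.spark.api.SparkProvider;
import me.lucko.spark.api.statistic.StatisticWindow;
import me.lucko.spark.api.statistic.misc.DoubleAverageInfo;
import me.lucko.spark.api.statistic.types.DoubleStatistic;
import me.lucko.spark.api.statistic.types.GenericStatistic;

public final class SparkStats {

    // Reads TPS and MSPT through the spark API. Call this after the server
    // has started; SparkProvider.get() throws if spark is not loaded.
    public static void printStats() {
        Spark spark = SparkProvider.get();

        // TPS averaged over the last 5 minutes (null on platforms without a tick loop)
        DoubleStatistic<StatisticWindow.TicksPerSecond> tps = spark.tps();
        if (tps != null) {
            System.out.println("TPS (5m): " + tps.poll(StatisticWindow.TicksPerSecond.MINUTES_5));
        }

        // Milliseconds-per-tick distribution over the last minute
        GenericStatistic<DoubleAverageInfo, StatisticWindow.MillisPerTick> mspt = spark.mspt();
        if (mspt != null) {
            DoubleAverageInfo info = mspt.poll(StatisticWindow.MillisPerTick.MINUTES_1);
            System.out.println("MSPT mean (1m): " + info.mean()
                    + " ms, 95th percentile: " + info.percentile95th() + " ms");
        }
    }
}
```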
Installation of Spark
The installation process depends on your server type (modded / plugin). Here’s how it’s typically done:
| Server Type | Steps to Install |
|---|---|
| Plugin (Bukkit/Spigot/Paper) | Download the Spark plugin JAR (from SpigotMC or the official Spark site) and place it in the server's `plugins` folder. |
| Mod (Forge/Fabric) | Download the Spark mod JAR matching your Minecraft and mod loader version and upload it to the server's `mods` folder. |
After placing the JAR, restart your server fully to activate Spark.
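On a Bukkit/Spigot/Paper server you can also confirm from another plugin that Spark actually came up after the restart. Here is a minimal sketch using the standard Bukkit API; the `SparkCheckPlugin` wrapper is illustrative, and the registered plugin name `spark` is an assumption to verify against your build.

```java
import org.bukkit.Bukkit;
import org.bukkit.plugin.java.JavaPlugin;

public final class SparkCheckPlugin extends JavaPlugin {

    @Override
    public void onEnable() {
        // A null/disabled lookup means the JAR was not picked up
        // (wrong folder, wrong loader version, or it failed to enable).
        if (Bukkit.getPluginManager().isPluginEnabled("spark")) {
            getLogger().info("spark is loaded and enabled.");
        } else {
            getLogger().warning("spark not found - check the plugins folder and restart.");
        }
    }
}
```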
Basic Spark Commands & Usage
Once Spark is installed, you can use specific commands to profile and inspect server performance.
Common commands
| Command | Purpose |
|---|---|
| `/spark profiler start` | Begin profiling (default mode) |
| `/spark profiler stop` | Stop profiling and generate a report (you'll get a link) |
| `/spark profiler info` | Check whether the profiler is currently running |
| `/spark profiler start --timeout <seconds>` | Run the profiler for a set number of seconds |
| `/spark profiler start --thread *` | Profile all threads |
| `/spark profiler start --alloc` | Profile memory allocations instead of CPU usage |
Other useful commands:
- `/spark healthreport`: full snapshot of server health (TPS, memory, CPU, disk, JVM args)
- `/spark gc`: analyze garbage collection activity
- `/spark heapsummary`: get a memory summary / snapshot
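Any of these commands can also be issued programmatically, e.g. from a scheduled maintenance task in a Bukkit-based plugin. A small sketch follows; the `SparkCommands` helper is hypothetical, while `Bukkit.dispatchCommand` is the standard API and takes the command without its leading slash.

```java
import org.bukkit.Bukkit;

public final class SparkCommands {

    // Dispatches a spark command as the console, exactly as if an
    // operator had typed it. Must be called from the main server thread.
    public static void runHealthReport() {
        Bukkit.dispatchCommand(Bukkit.getConsoleSender(), "spark healthreport");
    }
}
```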
Profiling Workflow & Best Practices
- When to profile: run Spark while the lag or performance issue is happening (or soon after), so that Spark records the relevant activity.
- Duration:
  - For general profiling, 30–60 seconds is enough to gather meaningful data.
  - If lag is sporadic, use flags like `--only-ticks-over <ms>` to record only the slow ticks.
- Stopping & retrieving the report: after stopping the profiler, you'll receive a URL (via console or chat). Open it in your browser to view the profiling report. A sketch automating this start/stop cycle follows this list.
- Reading the report:
  - The report shows threads, call frames, execution times, and percentages.
  - Expand the "Server thread" section to see which operations (e.g. tick, plugins, entity updates) are consuming time.
  - Hover over nodes to see ms values; percentages show relative cost.
  - Use the flame graph or "flat view" to isolate performance hot spots.
- Share with support or devs: you can share the generated report URL when asking for help diagnosing issues.
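If you capture profiles regularly, the start/wait/stop cycle above can be scripted so every capture is identical. Here is a minimal sketch against the standard Bukkit scheduler; the `ProfileRunner` class and the 30-second window are illustrative choices, not part of Spark itself.

```java
import org.bukkit.Bukkit;
import org.bukkit.plugin.Plugin;

public final class ProfileRunner {

    // Starts the profiler in its default mode, then stops it 30 seconds
    // later; spark uploads the report and prints the viewer link to console.
    public static void captureProfile(Plugin plugin) {
        Bukkit.dispatchCommand(Bukkit.getConsoleSender(), "spark profiler start");

        // 20 server ticks ≈ 1 second, so 600 ticks ≈ 30 seconds. The delay
        // stretches under heavy lag, which only lengthens the sample window.
        Bukkit.getScheduler().runTaskLater(plugin, () ->
                Bukkit.dispatchCommand(Bukkit.getConsoleSender(), "spark profiler stop"),
                30L * 20L);
    }
}
```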
Common Use Cases & Examples
- Consistent TPS drop → run `/spark profiler start --timeout 30` → `/spark profiler stop` → inspect which tasks are dominating time
- Lag spikes / occasional stutters → use `/spark profiler start --only-ticks-over 70 --timeout 60` (or adjust the threshold; a trigger-based sketch follows this list)
- Memory leaks or GC issues → use `/spark heapsummary` or `/spark gc` and review memory usage breakdowns
- Server health checks → `/spark healthreport` for an overview of system state
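For intermittent stutters you can go a step further and only start a profile when the server is actually struggling. This sketch combines the spark-api (same assumptions as earlier) with a periodic Bukkit task; the 70 ms threshold mirrors the example command in this list, and the `LagTriggeredProfiler` class is illustrative.

```java
import me.lucko.spark.api.Spark;
import me.lucko.spark.api.SparkProvider;
import me.lucko.spark.api.statistic.StatisticWindow;
import me.lucko.spark.api.statistic.misc.DoubleAverageInfo;
import me.lucko.spark.api.statistic.types.GenericStatistic;
import org.bukkit.Bukkit;
import org.bukkit.plugin.Plugin;

public final class LagTriggeredProfiler {

    private boolean running = false;

    // Checks recent MSPT once a minute; if the 95th percentile exceeds
    // 70 ms, starts a 60-second profile that records only the slow ticks.
    public void schedule(Plugin plugin) {
        Bukkit.getScheduler().runTaskTimer(plugin, () -> {
            Spark spark = SparkProvider.get();
            GenericStatistic<DoubleAverageInfo, StatisticWindow.MillisPerTick> stat = spark.mspt();
            if (stat == null) {
                return; // platform does not expose per-tick statistics
            }
            DoubleAverageInfo mspt = stat.poll(StatisticWindow.MillisPerTick.MINUTES_1);
            if (!running && mspt.percentile95th() > 70.0) {
                running = true;
                Bukkit.dispatchCommand(Bukkit.getConsoleSender(),
                        "spark profiler start --only-ticks-over 70 --timeout 60");
                // Clear the flag once the 60-second profile has finished.
                Bukkit.getScheduler().runTaskLater(plugin, () -> running = false, 65L * 20L);
            }
        }, 0L, 60L * 20L); // run the check every 60 seconds
    }
}
```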
Example (Step-by-Step)
1. Install Spark as a plugin or mod (see above).
2. Wait until the lag or performance degradation is happening.
3. Run in console: `/spark profiler start --timeout 30`
4. Wait 30 seconds.
5. Run: `/spark profiler stop`
6. Copy the link provided in the console.
7. Open the link to view the report.
8. Expand "Server thread" and follow the nodes with high percentages to identify troublemakers (plugins, entity ticks, chunk loads, etc.).
9. Use the memory or GC commands if needed.
10. Use the findings to optimize your setup, remove heavy plugins, or share the report with support.
Troubleshooting & Tips
- If no report link appears, ensure permissions are correct and Spark loaded successfully.
- If profiling doesn't collect data, try a longer duration or different flags (e.g. `--thread *`).
- If class/method names are obfuscated, enable deobfuscation mappings in the Spark viewer.
- Use the profiler selectively, not continuously — while Spark is lightweight, constant profiling can have minor overhead.
- Don’t forget to restart the server after installing Spark to ensure it’s fully loaded.
- Use multiple profiles if your issue is intermittent.

