For example, imagine that test A is 15% faster than test B. 15% is a very significant performance increase (for a player, it might mean a very welcome difference of 52 vs 60fps). If you're testing at around 100fps, you'll see it unmistakably: 100 vs 115fps. But if you're testing at around 15fps, that same 15% would produce only around a 2.25fps difference. After rounding, and with some random fluctuations and background system activity thrown in, that difference would tend to float around 1-2fps, or could even be masked entirely by something else. The result is that a significant 15% performance increase would seem to you like a "very very small gain".
In this case, perhaps it is a very very small gain, but we can't really know unless you give your test more FPS to work with (or use a more precise measurement than FPS, such as recording the timer after each tick).
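To illustrate that second approach, here's a minimal sketch in C++ (not Fusion code; `runOneTick` is a hypothetical stand-in for one frame of game work). It records a timestamp around each tick and averages the raw frame times in milliseconds, which surfaces a 15% difference even at low framerates where rounded FPS would hide it:

```cpp
#include <chrono>
#include <cstdio>
#include <vector>

// Hypothetical placeholder for one frame of game work; in a real test
// you'd call your engine's actual update/render step here instead.
void runOneTick() {
    volatile double x = 0;
    for (int i = 0; i < 100000; ++i) x += i * 0.5;
}

int main() {
    using clock = std::chrono::steady_clock;
    std::vector<double> frameMs;
    frameMs.reserve(1000);

    for (int i = 0; i < 1000; ++i) {
        auto start = clock::now();
        runOneTick();
        auto end = clock::now();
        // Store the raw per-tick duration in milliseconds, unrounded.
        frameMs.push_back(
            std::chrono::duration<double, std::milli>(end - start).count());
    }

    double total = 0;
    for (double ms : frameMs) total += ms;
    double avg = total / frameMs.size();

    // At ~15fps, a 15% speedup is ~66.7ms vs ~58.0ms per frame: easy to
    // see in milliseconds, but easily lost after rounding to whole FPS.
    std::printf("average frame time: %.3f ms (%.2f fps)\n", avg, 1000.0 / avg);
}
```

Comparing average (or median) frame times between two builds sidesteps the rounding problem entirely, since the measurement's precision no longer depends on how high your framerate happens to be.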
But I'm fine with that, because I'm targeting PC gamers, the majority of whom have dedicated GPUs (and those who don't usually know to keep their expectations low in many games). My PC (4770K @ 3.5GHz + GTX 980 Ti) is on the high-end, though it's ageing. I also test on a gaming laptop, as well as an old PC with very modest hardware (an ancient Core i5 and a budget-level GPU from several generations back), and it runs fine (60fps except in poorly optimised areas). Spryke's minimum system requirements will be similar to (though probably even lower than) those of most standard AAA games. That perhaps makes it a little unusual in Fusion/retro circles, but well within the expected norm otherwise.