The timer is a key part of my game's engine. I designed a core mechanic around the event "every 10 subtract 1 from alterable value a", with the 10 presumably meaning 10 milliseconds. I'd been testing the game at 60fps for months and it worked flawlessly, but recently I decided to try it at 30fps and even 120fps. When the framerate changes, the timer breaks and behaves completely differently from how it did at 60fps. I didn't think that was supposed to happen, which is why I specifically chose the timer for so many events in the first place. Is the timer supposed to vary with the fps, or am I missing something?? Very frustrated and confused!
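
To pin down what I'm seeing, I wrote a quick Python simulation of one second of game time. The first function is only my guess at what the engine might be doing (evaluating the timer condition once per frame, so it can fire at most once per frame); the second is what I assumed an "every 10" event would do. Both functions and their names are just mine for illustration, not anything from the actual engine:

```python
def capped_fires(fps: int, interval_ms: float = 10.0) -> int:
    """My guess: the timer condition is checked once per frame and can
    fire at most once per check, so fires/sec is capped by the fps."""
    frame_ms = 1000.0 / fps
    elapsed = last_fire = 0.0
    fires = 0
    for _ in range(fps):                     # one second worth of frames
        elapsed += frame_ms                  # one frame passes
        if elapsed - last_fire >= interval_ms:
            fires += 1                       # at most one fire per frame
            last_fire = elapsed              # reset the timer reference
    return fires

def accumulated_fires(fps: int, interval_ms: float = 10.0) -> int:
    """What I expected: bank the real frame time and fire once per full
    10 ms, even if several intervals elapse within a single frame."""
    frame_ms = 1000.0 / fps
    acc = 0.0
    fires = 0
    for _ in range(fps):                     # one second worth of frames
        acc += frame_ms
        while acc >= interval_ms:            # catch up within the frame
            acc -= interval_ms
            fires += 1
    return fires

for fps in (30, 60, 120):
    print(f"{fps:>3} fps: {capped_fires(fps)} fires/sec, "
          f"expected {accumulated_fires(fps)}")
```

Running this, the capped version gives 30, 60, and 60 fires per second at 30, 60, and 120 fps, while the accumulator always gives 100. If the engine really does work like `capped_fires`, an "every 10" event tops out at one fire per frame and its rate shifts whenever the framerate does, which is the same flavor of framerate-dependence I'm fighting. But that's just my guess, so I'd appreciate confirmation from someone who knows how the timer is actually implemented.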