Is there any particular reason as to why MMF2 cannot calculate anything quicker than 1/50th (0.02) of a second?
Thanks!

Change the frame Rate in the application properties and the event loop will go faster.
The default is 50 fps.
Thanks.
I've got it set to 60 fps w/VSYNC on a 60hz monitor.
My problem is that I am experimenting with the rhythm game formula, but I believe I need to calculate to hundredths or even thousandths of a second. Is this possible?

I don't know if this is the right tool for precision down to a thousandth of a second.
Sorry
Thanks.
That said, are hundredths of a second possible at 100 fps? (Considering that at 50 fps, 1/50th of a second is possible.)

Well, it's not like other rhythm games use more than 60 FPS or so. I'm not sure why you'd need that.
You are given the milliseconds (i.e. 1/1000 seconds) elapsed since the beginning via "timer".
That way you can pretty much calculate anything time-based that you want.
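Since MMF2 is event-based, there's no actual code to show, but the math for using that millisecond value in a rhythm game is simple enough to sketch. This is an illustrative Python sketch, not MMF2 expressions; the function name and the 120 BPM figure are just assumptions for the example.

```python
# Sketch: judging a hit against the nearest beat using the elapsed
# milliseconds, as the "timer" value would provide in MMF2.
def beat_offset_ms(elapsed_ms, bpm=120):
    """Return the signed distance (ms) from the nearest beat."""
    beat_ms = 60_000 / bpm          # one beat at 120 BPM = 500 ms
    phase = elapsed_ms % beat_ms    # position within the current beat
    # Fold into [-beat_ms/2, +beat_ms/2): early hits come out negative
    return phase - beat_ms if phase > beat_ms / 2 else phase

print(beat_offset_ms(1490))  # 10 ms early -> -10.0
print(beat_offset_ms(2015))  # 15 ms late  -> 15.0
```

The same modulo-against-beat-length idea can be built out of MMF2 counters and expressions, since it only needs the elapsed-time value.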


Just remember, the game won't be detecting events or updating the screen any faster than your frame rate, and at 60 FPS that's a precision of 16-17 ms between each tick. While MMF2 can give you the milliseconds elapsed at each frame, all it's really doing is incrementing by 16-17 ms per frame; you'll never see ticks at a higher resolution than that. You could raise the frame rate above 60 FPS, but in general this is not a good solution, as most computers won't handle updating the graphics that quickly.
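The arithmetic behind those "16-17 ms" figures is just one second divided by the frame rate; a quick sketch:

```python
# Effective timer resolution: the timer can only be sampled once per
# frame, so its granularity is 1000 / FPS milliseconds.
for fps in (50, 60, 100):
    print(fps, "FPS ->", round(1000 / fps, 2), "ms per tick")
# 50 FPS -> 20.0 ms, 60 FPS -> 16.67 ms, 100 FPS -> 10.0 ms
```

So 100 FPS would indeed give you your hundredths of a second, but only if the machine can actually sustain that frame rate.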
I mean, let's say you have a game that runs at 50 FPS. You could have a counter that is set to "timer" each frame. You'll see it go from 0 to 20 to 40 to 60 to 80 to 100 and so on.
Now you could instead have an event that says "always: add 20 to counter" and you'd get the same effect in terms of millisecond precision; it will just jump 0 to 20 to 40 to 60 and so on.
The difference with the timer object is that it's detached from the process, using your computer's clock via Windows calls or whatever. So it will behave slightly differently: the timer keeps adding up in real time, whereas your project might run at slightly less than 50 FPS, experiencing random lags and slow frame draws, causing the counter to fall just below what the timer reads; over time the two will drift apart.
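That drift is easy to see with a toy simulation; here's a Python sketch (the 20.5 ms figure is just an assumed slight lag, not a measured MMF2 value):

```python
# Toy simulation of the drift described above: a counter that adds a
# fixed 20 ms per frame vs. a real-time timer, when frames actually
# take slightly longer than the nominal 1/50th of a second.
nominal_ms = 20.0   # what the counter adds each frame (50 FPS assumed)
actual_ms = 20.5    # real elapsed time per frame, with a slight lag
counter = timer = 0.0
for frame in range(3000):   # one minute of nominal game time
    counter += nominal_ms   # frame-based value
    timer += actual_ms      # real-time clock
print(timer - counter)  # -> 1500.0 ms: the counter is 1.5 s behind
```

Half a millisecond of lag per frame doesn't sound like much, but after a minute the frame-based counter is a second and a half behind real time, which is exactly the desynchronization described above.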
So that can be both a blessing and a curse, and you need to evaluate what you're running it on. For example, if you wanted to use it as a stopwatch for your project, maybe the timer is more important, since it is more accurate in real time. But if you want to see how many seconds your main character has been sitting in a pit of fire, using frame-based values might make more sense, since someone with a low frame rate might experience in 5 frames what he should have experienced in 60, giving him less in-game reaction time while having the same human reaction time. Meanwhile, if you want to see where you are on a soundtrack, particularly for samples, you could just refer directly to its duration and position on the track through the sound object; it can retrieve that information for the currently playing sample.