Perhaps this has been asked before, since it seems really simple and I'm a bit frustrated at myself for fumbling over it, but here goes: I'm building something that uses the timer to increase a value every 1/100th of a second, yet the actual elapsed time is way off. I'm assuming the timer is tied directly to the frame rate, because when I set the frame rate to 100fps the count is accurate.
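To make the symptom concrete, here's a rough sketch in plain C (not the engine's actual event code, just my understanding of what's happening): the counter effectively increments once per frame, so at 60fps it only reaches 60 after one real second instead of 100.

```c
/* Minimal sketch of the behaviour I'm seeing (assumed logic, not engine code):
 * a counter meant to tick every 0.01 s actually ticks once per frame. */
#include <stdio.h>

int main(void) {
    int fps = 60;           /* frame rate I actually want to run at */
    int hundredths = 0;     /* value I increment "every 1/100th second" */

    /* simulate one second of frames */
    for (int frame = 0; frame < fps; frame++) {
        hundredths += 1;    /* +1 per frame, NOT per 0.01 s of real time */
    }

    /* at 60 fps this prints 60, not 100 -> the count runs ~40% slow */
    printf("after 1 second: %d\n", hundredths);
    return 0;
}
```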
There must be a way to get an accurate timer at 60fps (since I'm going to be exporting to iOS)... or is it safe to keep it at 100 if that's the only way?
I've been reading about and experimenting with Vsync (which locks the frame rate to the display's refresh rate) and the Timer based movements property, but the only combination I've found that gives an accurate count is frame rate: 100, Vsync: off, Timer based movements: either setting (it seems accurate regardless).
Am I missing something, or is the timer object really just a frame counter that's independent of actual time?
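For what it's worth, the behaviour I'm hoping to get looks something like this, again sketched in plain C rather than the engine's own API (clock_gettime / CLOCK_MONOTONIC is just my stand-in for "a real clock"): drive the counter from real elapsed time, so the frame rate stops mattering.

```c
/* Sketch of what I assume I actually need: increment the counter based on
 * real elapsed time, not on how many frames have rendered. */
#include <stdio.h>
#include <time.h>

int main(void) {
    struct timespec start, now;
    clock_gettime(CLOCK_MONOTONIC, &start);

    int hundredths = 0;
    for (;;) {
        clock_gettime(CLOCK_MONOTONIC, &now);
        double elapsed = (now.tv_sec - start.tv_sec)
                       + (now.tv_nsec - start.tv_nsec) / 1e9;

        /* catch up: one increment per 0.01 s of real time, whatever the fps */
        while (hundredths < (int)(elapsed * 100.0)) {
            hundredths++;
        }

        if (elapsed >= 1.0) break;  /* stop the demo after one second */
    }

    printf("after 1 second: %d\n", hundredths);  /* ~100 at any frame rate */
    return 0;
}
```

Basically: count real elapsed time, not frames. Is there a setting or object in the engine that behaves like this at 60fps?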