accuracy of the timer object

  • I am going to base the score on the number of seconds the player managed to play the game without losing.
    It's a mobile game, so an inconsistent frame rate across different devices would not be surprising.

    Is the standard timer object "every X seconds" condition accurate enough to keep track of the seconds, or should I find some other way to do this?

  • As a general rule:
    The timer generally stays in sync with the system clock, and if it falls behind it will run fast to catch up. This means framerate drops will not delay the timer.

    Adding to a counter every frame, on the other hand, is delayed by framerate drops, so your counter may read 50 when 60 seconds of real time have passed.

    Most games will want to rely on the counter method, because with the timer, users could purposely force framerate drops to 'cheat' the system.
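    For example, the counter method (counter names here are just placeholders, assuming the default 50 fps) could look like:

    Always
    > Add 1 to counter "Frames"

    Counter "Frames" >= 50
    > Add 1 to counter "Seconds"
    > Set counter "Frames" to 0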


  • I think there's a misunderstanding: Popcorn isn't suggesting using the Fusion expression FrameRate, but rather entering your expected framerate.

    For example, if your framerate is the default 50, use the event:

    Always
    > Add 100.0 / 50 to counter

    And remember to simplify for better performance:
    Always
    > Add 2 to counter

    Or create your own variable for the framerate (sketched at the end of this post).

    Users can mess with the system clock; it's far less of a risk on mobile devices or HTML/Flash, as I'm sure there's protection for this. I'm not entirely sure how Fusion copes with system clock changes; maybe one of the staff can comment on whether a player could potentially mess with it to gain an advantage when using timer events.
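    Going back to the "create your own variable for framerate" option above, one possible setup (the global value name here is just an example) might be:

    Start of Frame
    > Set Global Value A to 50

    Always
    > Add 100.0 / Global Value A to counter

    That way, if you later change your target framerate, you only have to update it in one place.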


  • You can get the current framerate through an expression, "FrameRate", so always adding (100.0 / FrameRate) to a counter does in fact seem to work as a seconds timer even with a changing framerate.

    I tested this by comparing the standard "every 1 second" condition to Popcorn's expression, with MMF set to alter its frame rate throughout the runtime. There is some drift between the two timers, though.

    I've uploaded my example. I can't tell for sure which one is the most accurate.
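    Roughly, the two events being compared look something like this (counter names are just placeholders):

    Every 1 second
    > Add 1 to counter "Timer seconds"

    Always
    > Add 100.0 / FrameRate to counter "FrameRate seconds"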

  • Yes, but it will allow you to accelerate the timer without the same number of frames taking place: drop the framerate to 2 and you'll be adding 50 to the counter every frame.

    So with your above method, the player can play less of the game and still get the counter higher. If the object of the game is to last the longest, then a player with a low framerate can get a higher score by playing less of the game than someone who has a high framerate.

    So it's still recommended to build it on a counter you add to each frame, one that has nothing to do with the Timer object or the framerate.

    If you absolutely must use the above method, you'll need to build everything around timer events, but it would likely turn out glitchier than if you build everything based on actual frames rather than timers and "FrameRate" division.

    Users can't skip frames, but they can meddle with the framerate; I don't recommend giving your users an extra tool to manipulate your system with.
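    If you go that route, a rough sketch (assuming a 50 fps target; the names are just examples) is to score on raw frames and only convert to seconds at the end:

    Always
    > Add 1 to counter "Frames survived"

    Player loses (whatever your lose condition is)
    > Set counter "Score in seconds" to counter "Frames survived" / 50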


  • Well, "Every 1 second" should have the same result as 100.0 / "FrameRate", given that both account for framerate dips. I'd be curious whether there is any difference at all.

    My suggestion was to avoid timer-based movements altogether: the frame has a checkbox for timer-based movements; turn it off and frames will dictate your movement, not seconds. Then count up by adding (100.0 / expected framerate) to a counter.

    The question really is: do you want your application to run in slow motion when the framerate drops, or be choppy but run at the same speed? Both have their benefits.

    The issue I've encountered is with custom movements in the event editor: timer-based logic can get unpredictable because it's variable, whereas frame-based logic always gives the same results and is very predictable. Hopefully someone else can add some insight, as the best choice is not always obvious.

    One example I can think of: I was creating enemies every 5 seconds, but if the framerate dipped, the screen would become flooded with enemies because the user literally could not dispose of them fast enough. I wanted my game to be the same difficulty for people with a lower framerate, so I made sure the game slowed down when the framerate dipped; the experience was the same for everyone, albeit slower for some people. Some games get easier when slowed down, but if the controls are tight, most people should be able to play comfortably at normal speed.
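    As a rough illustration of that frame-based approach (assuming 50 fps, so 250 frames is roughly 5 seconds; the names are just placeholders):

    Always
    > Add 1 to counter "Spawn timer"

    Counter "Spawn timer" >= 250
    > Create an enemy
    > Set counter "Spawn timer" to 0

    When the framerate dips, fewer frames pass per real second, so enemies also spawn more slowly and the difficulty stays the same.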



  • Quote

    "Every 1 second" should have the same result as 100.0 / "FrameRate"

    I am seeing a clear difference. In my example, the "every 1 second" condition seems to be the most stable (see the attached example).

    Quote

    The question really is: do you want your application to run in slow motion when the framerate drops, or be choppy but run at the same speed? Both have their benefits.

    Yeah, I'd like to keep the game at a consistent speed, so choppy is preferable.
