Posts by JimJam


    There used to be a couple of MFAs with examples of how to use the TileMap extension, but I can't seem to find them. Luckily, I had combined them all into a single MFA at some point; here's that MFA (attached below). It should have examples of how to do animations.

    Now here's an issue I'm having that I can't find a solution to. It's possible to load tilesets from a file into the TileMap object. Then, when you save the TileMap to a .map file, the paths to the tilesets are also saved into the .map file. My only problem is that I cannot figure out how to re-load the tilesets from the .map file. When I load the .map file in Fusion and use an expression to get the count of tilesets, it always reports 0, as if the tilesets were not loaded.

    If I open the map in a hex editor, I can see that the paths to the tilesets are stored in a "TILE" block in the binary.

    EDIT: I can't get an image to embed on the forum for some reason, so here's a link: Please login to see this link.

    The "TILE" block is highlighted in red here. My workaround for not being able to load the tilesets is to use the "Map Properties" instead, and manually save the filenames to properties called "T0", "T1", etc. for each tileset. Then, upon loading the map, I run a fastloop to check for these properties and add a new tileset for each one. Only then do external tilesets work. This is obnoxious, though, because it duplicates the same data in the file.

    There has got to be some way to actually read the data in the "TILE" block to get the tileset paths; otherwise, why would we even have the option to save that data in the first place? In the example files, I couldn't find anything about how to save/load external tileset files this way. The examples only covered loading external maps with tilesets predefined in the Fusion editor.

    Also, is there any documentation on how this TileMap file format works? I can see that it's binary, and I only know how to use a hex editor a little bit. Beyond that, it's a bit of a mystery to me how to get information out of it.
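    In case it helps anyone poking at the format, here's a rough Python sketch of scanning a .map file for tileset paths. The block layout is entirely an assumption based on what I saw in the hex editor (a literal "TILE" tag followed by null-terminated path strings); the real format may well differ, so treat this as a starting point for experimentation, not a spec.

```python
def find_tileset_paths(data: bytes) -> list[str]:
    """Scan raw .map bytes for a "TILE" block and pull out path strings.

    Assumed layout (unverified): the literal tag b"TILE", then
    null-terminated ASCII path strings, ending at an empty string
    or at bytes that don't decode as ASCII.
    """
    paths = []
    pos = data.find(b"TILE")
    if pos == -1:
        return paths          # no TILE block found
    pos += 4                  # skip past the "TILE" tag itself
    while pos < len(data):
        end = data.find(b"\x00", pos)
        if end == -1 or end == pos:
            break             # empty string = assumed end of block
        try:
            paths.append(data[pos:end].decode("ascii"))
        except UnicodeDecodeError:
            break             # hit binary data, stop scanning
        pos = end + 1
    return paths

# synthetic example blob (NOT a real .map file)
blob = b"MAPHDR\x00TILEtiles/grass.png\x00tiles/cave.png\x00\x00MORE"
print(find_tileset_paths(blob))  # → ['tiles/grass.png', 'tiles/cave.png']
```

    If the real block turns out to be length-prefixed instead of null-terminated, the loop would need adjusting, but the "find the tag, then walk the strings" approach should still apply.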

    Weird. To reset the debugger window settings, run regedit and delete the following key:

    HKEY_CURRENT_USER\Software\Clickteam\Fusion Developer 2.5\Debugger

    Let me know if you can reproduce the problem later.

    Hey, this is a sort of tangential question, but... does the Steam version also save its registry info in "HKEY_CURRENT_USER\Software\Clickteam\"?

    I'm writing a batch file to back up all my installed programs' preferences/settings, just in case of hard drive failure or something, and I don't want to miss anything that might be in the registry.

    I did some tests:
    - If you use multiple tilemap objects (a single tilemap for each) and multiple viewports, you get 1000 fps.
    - Also, Application Properties > "Machine-independent speed" makes a huge difference.

    Interesting. I tried some of your suggestions. I'm writing this as I test things. Here's what I got:

    ---

    I turned Machine-Independent Speed ON, and that alone seemed to increase the frame rate, but in a sort of odd way. Fusion's debugger reports a pretty consistent 60 fps, but visually the screen looks like it's dropping frames (the game appears stuttery), so I don't think that's a real solution. It seems the underlying game engine is running at a smooth 60 fps, but the actual screen refresh is stuttering around maybe 40 fps (my best guess).

    If I bump the FPS on my game up to 1000, I get around 100 fps on the frames with the Tile objects in them. I have one level in my game where the FPS drops down to 40 (when it's set to 1000). This only seems to happen when I move the player to a particular underground section of my map, so the slowdown must have something to do with scrolling + multiple tilesets overlapping.

    ---

    UPDATE: If I turn "VSync" on, it seems to lock the game at a maximum of 60 fps, even if the framerate is set to 1000. It also appears very smooth and consistent. But if I change the framerate back down to 60, I get stutters and drops into the 40s again.

    ---

    I can't test the "multiple tilemap objects (single tilemap for each)" suggestion right now, because that would require completely redesigning how I set up my game. But even if it does work, it feels like a sad workaround -- one of the main benefits of the TileMap object is the ability to store multiple layers, tilesets, and basically all of your level's map data in one object. Having to use multiple objects feels hacky, and probably increases RAM usage too.

    But I'll try to test this out later if I can.

    ---
    ANOTHER UPDATE:
    I tried testing out a "mixed installation" of the TileMap.mfx and TileMapView.mfx files. What I mean by this is:

    - I installed the old version of TileMapView.mfx, which has a "Date Modified" value of "11/19/2016 5:48 PM"
    - This date matches up exactly with the last commit Looki made on 11/19/2016: Please login to see this link.
    - I installed the newer version of TileMap.mfx, which has a modified date of "4/6/2019 11:51 PM"

    Interestingly, when I run my game now (with the new TileMap.mfx and the old TileMapView.mfx), the frame rate is perfect, smooth, and stable... BUT... the TileMap viewport won't render any tilesets except the first one it finds. The parts of my level that used a second tileset are now just blank.

    This confirms what I originally suspected: there's nothing wrong with the TileMap.mfx extension; it's the TileMap Viewport extension that is bugged. This makes sense if you consider that the TileMap object is basically just a specialized data array -- it doesn't have any built-in graphical components.

    The TileMap Viewport object is the one that is rendering the graphics to the screen (and thus is the extension interacting with your GPU).

    Now, the odd thing is that the GitHub repo for the TileMap object has a last commit of 4/7/2019 (which lines up with the file date on the mfx extension):
    - Please login to see this link.

    But the TileMap Viewport's GitHub page says the last commit was by Looki in 2016... yet the newer MFX file has a modified date of "4/6/2019".


    Conclusion:
    If Yves was the one who updated the TileMap and TileMap Viewport extensions, it appears that he forgot to commit/push the source code to GitHub, but only for the TileMap Viewport extension. So we (as a community) have no clue what was changed in that extension that is causing this bug, and therefore can't really fix it.

    On top of that, the bug causing frame drops, lag, and issues with more than one tileset seems to be specific to the TileMap Viewport object. My guess is that it's some small inefficiency when the object tries to load/access a second image texture.

    Yves, could you please push the TileMap Viewport changes you made to the public GitHub, so maybe someone can spot the bug and fix it? Without being able to see that source, we're in the dark.

    Oh, I didn't know it was open source. I was having trouble finding any info on it through Google.
    For anyone interested in developing the extensions, I found the GitHub pages.

    The TileMap object: Please login to see this link.
    The Tilemap Viewport: Please login to see this link.

    TiledMapLoader object: Please login to see this link.

    From my experience, the slowdown doesn't seem to be due to the TileMap or TiledMapLoader objects, but specifically to the Viewport object. So that might be a good place to focus.

    I don't really know anything about C++, so fixing GPU-level optimizations would be way beyond my knowledge.

    I've been using the older version of the TileMap & Viewport extensions for years (the one that was installed before today says "copyright 2016 Looki" in the description box), and I've never had any issues with it. So I'm 100% certain that the older version does not have these performance issues.

    Hmm, that's really disappointing.

    I guess I'll just downgrade the extension then. There's no real point in using the newer one with performance that bad. I mean, the whole point of the Tile Map Viewport is that it's super fast and efficient. The older version performs much better than any alternative method of rendering a tile map in Fusion (the "add backdrop" method, hundreds of Actives, the Active Picture object, the Surface object, etc.).

    Oops, I forgot to check back on this thread.

    I downloaded the DX11 version, and performance with it seems significantly worse. If I set my Fusion game to DX9 mode, even very simple tile maps bog the whole engine down to 30 fps. Even in DX11 mode, I get significant lag and poor performance (40 fps, slowdown, etc.).

    So my only option is to use the old version made by Looki, and stick to DX9. With Looki's old version (last updated 2016) I get a solid 60 fps with large tile maps at 1080p. No stuttering, no dipping, no lagging.

    I'm not even sure what to say.

    I just noticed that the Tile Map Viewport won't display any tiles if you have your application set to Direct3D 11. Direct3D 9 does work though. Is there a version of this extension updated for D3D 11 and later?

    I checked the Extension Manager, and it said it was "up to date", but I know that sometimes newer versions of extensions don't always make it into the Manager.

    MuddyMole - Awesome work; I've had your OneDrive link bookmarked for a while now from some other thread. There's a lot of great stuff in there to sift through.

    And yeah, I agree that for tiles it's probably too much to manually handle them this way. I just brought it up because it's relevant to the topic at hand (Castlevania Map). I'm sure Fusion itself (or the tilemap extension) handles this sort of task much more efficiently than an array system improvised in Fusion. But for particle systems, it can definitely be useful to have limitations in place.

    That said, the Tilemap extension would be the best bet for handling the actual static tiles. But this doesn't touch on the issue of actual Active Objects (the enemies, items, etc. in your level). You'll still need to figure out some way of handling how these things should behave when off screen. Should you load them all at once and keep them in memory? Should you create some system to dynamically load them? Should you just let Fusion handle it all by default?

    It really depends on what the objects are doing, and how you want them to behave.

    I've had some issues before with letting Fusion just handle everything. For example, for items or powerups that have little to no behavior (just a static Active that the player collides with), letting Fusion deactivate them tends to work fine. But I've had issues with enemies falling through the floor when off screen, or enemies pausing (getting deactivated) in weird spots outside the screen. So I think how to handle these objects needs to be evaluated case by case.

    seep - Definitely check out the Tilemap object in the Extension Manager. For the actual background tiles, it's incredibly efficient and dynamic.

    Good point, MuddyMole. Yeah, loading in new chunks without ever unloading old ones would eventually cause major slowdown and fill up your RAM. But to properly manage that and unload areas, you'd need some sort of direct access to the memory. To my knowledge, Fusion handles memory allocation all on its own behind the scenes, so you don't have to deal with that stuff. In most cases this is great, because it allows rapid prototyping and less "micro-managing" on our part. But for large data sets like that, yeah, it can be an issue.

    One neat trick that I've found: you can sort of create your own "RAM" management, by using a finite number of objects with an ID alt value, and re-order them based on context. This way you always have the same number of objects loaded into memory at all times (no lag, because Fusion isn't constantly creating and destroying new objects, thus having to constantly add/remove things from RAM).

    An example of this: let's say your character kicks up a little dust cloud particle every time they walk, and after about 1 second the dust cloud fades away and would normally be destroyed. Using the normal create/destroy method, you'd be constantly creating (and destroying) objects. Now let's say this dust particle is only kicked up 4 times per second (according to your character's walk animation). Since each particle lives for 1 second, this means that, at most, there will only ever be 4 particles in existence at once.

    Since you know exactly how many particles there will ever be, instead of using the create/destroy method you can just "cycle" through 4 pre-made particles that are always in the frame. When a dust particle reaches the end of its lifespan, you simply make it invisible and maybe store it off screen. If you give each particle object a unique ID (ranging from 0-3), you can use a Global Value as a "particle index" so you know which particle to "create" (make reappear) next. To loop your particle index back to 0 when it reaches the end, you simply apply MOD math: ParticleIndex = (ParticleIndex + 1) mod 4.

    If you use this method, you also completely avoid any potential bugs where (for example) a particle effect accidentally creates 10,000 objects until your game crashes!
    The downside is that there is a hard limit on the number of particles on screen -- but this limit can be whatever you want it to be; you simply have to make a good estimate of how many particles you need.

    For example, if you have a character that kicks up 4 dust particles per second, each with a lifespan of 1 second, you'll only ever need a max of 4 particle objects. If you have 2 characters on screen, you'd need a max of 8. If you want a safe "buffer zone" for extra particles, you can increase the limit a little beyond what you need (+5), or even just double it (16).
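    For anyone curious what this cycling scheme looks like outside of Fusion events, here's a minimal Python sketch. The ParticlePool class and its method names are made up for illustration; in Fusion this would just be an ID alterable value per object plus a Global Value counter.

```python
class ParticlePool:
    """Fixed pool of reusable particle slots; no runtime create/destroy."""

    def __init__(self, size: int):
        self.size = size
        self.index = 0                    # next slot to recycle (the Global Value)
        self.active = [False] * size      # visibility flag per slot

    def spawn(self) -> int:
        """'Create' a particle by recycling the next slot in the cycle."""
        slot = self.index
        self.active[slot] = True
        self.index = (self.index + 1) % self.size   # the MOD trick
        return slot

    def expire(self, slot: int) -> None:
        """'Destroy' a particle by hiding it instead of deleting it."""
        self.active[slot] = False

pool = ParticlePool(4)
slots = [pool.spawn() for _ in range(6)]  # 6 spawns wrap around 4 slots
print(slots)  # → [0, 1, 2, 3, 0, 1]
```

    Note that spawning a 5th particle silently recycles slot 0, which is exactly the hard-limit trade-off described above.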

    ---

    So a more relevant example of this would be a level screen. Let's say you know there could only ever be 100 tiles on screen at once in your game. You could use 100 Actives, simply reposition them when the screen scrolls, and load a new "tile" animation into each. Since there physically cannot be more than 100 tiles on screen, you don't need to "create" new chunks. And again, if you wanted some buffer room, you could increase your limit to 200 so that your game loads the tiles on screen plus a little ahead of and behind you.
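    As a sketch of the repositioning math (the tile size and screen size here are made-up numbers, and the function name is mine), a fixed pool of tile Actives only ever needs to cover the visible window:

```python
def visible_tiles(cam_x: int, cam_y: int, screen_w: int, screen_h: int,
                  tile: int = 16) -> list[tuple[int, int]]:
    """Return the (col, row) world coordinates that a fixed pool of tile
    Actives should be repositioned to for the current camera position."""
    first_col = cam_x // tile
    first_row = cam_y // tile
    cols = screen_w // tile + 1   # +1 covers partially visible edge tiles
    rows = screen_h // tile + 1
    return [(first_col + c, first_row + r)
            for r in range(rows) for c in range(cols)]

# a 64x32 screen of 16px tiles needs only (4+1)*(2+1) = 15 tile Actives
coords = visible_tiles(cam_x=24, cam_y=0, screen_w=64, screen_h=32)
print(len(coords))  # → 15
```

    Each scroll step, you'd recompute this list and reposition the same pool of Actives to the new coordinates, swapping in the right tile animation for each.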

    To the best of my knowledge, this is how most old SNES-era games worked. They'd use a sort of "array" to manage all the objects and tiles on screen. So when you killed an enemy, or they went off screen, their slot in the array would be "freed up" to load a new enemy. That's how they worked within the memory and sprite limits of the hardware at the time. But using those same techniques on today's hardware also helps make our code much more efficient.

    The reason I'm asking is that I'm currently building all my levels in one frame that is loaded in "chunks" at the start of the level. So not every object is present at the same time, but the size of the level stays the same, and it's somewhat large (15168x8256). I haven't noticed any slowdowns yet, though.

    To my knowledge, the technique you're using should be fine. "Chunking" levels and loading them on the fly is a pretty standard game technique, used in all sorts of games, from Zelda to Grand Theft Auto, and probably most famously Minecraft. Notably, though, those games "stream" the chunks in as needed (only the chunk the player is in, plus the ones immediately around it). Loading all the chunks at the start of the level kind of defeats the point of chunking in the first place.

    But if you haven't noticed any slowdown so far, you're probably fine.
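    To illustrate the "stream only nearby chunks" idea in code, here's the bookkeeping part only (the chunk size and radius are arbitrary numbers, not anything from Fusion):

```python
def chunks_to_keep(player_x: int, player_y: int,
                   chunk_size: int = 512, radius: int = 1) -> set[tuple[int, int]]:
    """Chunk coordinates that should stay loaded: the player's chunk
    plus `radius` chunks in every direction (a 3x3 area with radius=1).
    Anything outside this set is a candidate for unloading."""
    cx = player_x // chunk_size
    cy = player_y // chunk_size
    return {(cx + dx, cy + dy)
            for dx in range(-radius, radius + 1)
            for dy in range(-radius, radius + 1)}

# player standing in chunk (1, 1) keeps the 9 chunks around them loaded
keep = chunks_to_keep(player_x=1000, player_y=600)
print(len(keep))  # → 9
```

    Each time the player crosses a chunk boundary, you'd diff the new set against the currently loaded set: load what's new, unload what dropped out. That keeps memory roughly constant no matter how big the level is.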

    It might be a good idea to find someone with a 10-year-old low-end laptop (or use one yourself, if you have one) to do some "limit testing" for your game. You'll get a better idea of what causes slowdown on a slower machine.

    One thing I've learned in general is that video games are not built on "perfect code" but on "good enough hacks" that no one will really notice. One of my favorite little "tricks" was in Mario Galaxy -- they needed a certain door in a certain level to make a text note pop up. But instead of programming a new object that Mario could "read", they just hid an already-programmed Sign Post object behind the door.

    Please login to see this link.

    Fusion is a powerful tool -- you could probably build keyloggers and trojan malware with it. Don't forget that no real coding is needed, and it's quick and easy to learn.

    People are not always good. If you had a good game out on the market and I were your main competitor, I could pull all sorts of dirty tricks -- including submitting your game as a potential threat. There are tons of sites on the net that offer shareware wrapped in malware installers. I could upload your game and report the file. Repeat that a few times, and have fun with the reports you'd get from your loyal customers.

    Not that I would do that personally. Just a thought.

    Okay so leaving all the bad stuff aside, here is a good read about the general problem: Please login to see this link.

    Wow, that's an interesting read. Yeah, I'm aware that false positives have been a problem with AV since forever, but the situation described in that link is one of the most frustrating things I've read. All his files come up clean, but then he builds his program into the installer, and BAM: false positive. Makes no sense!

    And of course, all these AV engines are basically a black box to us -- we as software makers have no insight into the criteria they use to trigger these virus alerts.

    I just downloaded the new Fusion 292.27 build and created a new EXE of a blank frame. Microsoft Defender no longer flags blank Fusion apps as malware, but I'm still getting 18 false positives.
    Please login to see this link.

    Most of the major AV programs read it as clean, but a few of the bigger ones (Avast, BitDefender, Kaspersky, AVG) read it as malware. And a few are still picking up "Zusy / Key Logger".
    What's frustrating as a Fusion user, though, is that unlike the blog you linked, I can't strip down my EXE's code to figure out what is triggering this stuff -- because it's already a blank MFA. I guess that's just the cost of ease of access, and of not building one's program totally from scratch in C++ or something.

    Regardless, having to convince people who download my EXE that there isn't a keylogger in it is probably not very reassuring to them.

    These anti-virus companies basically get to write the rules on who is a "legitimate" developer. The AV software we have available kind of sucks, but the alternative of using no AV isn't better. So we just have to deal with it. :/

    I just ran a test. I opened Fusion build 292.26 (Steam version) and created a new application: one frame, totally blank, nothing changed. I built the program as "test.exe" and saved it to my desktop.

    It is immediately flagged and quarantined by Windows Defender. When I run it through VirusTotal, it gets 22 positive results, including 3 counts of "Key Logger". The rest of the virus engines report "undetected".

    While any program being detected as "malware" is annoying, being detected as a "Key Logger" is more serious, and would naturally scare the **** out of anyone using one of our programs. Especially since it's also flagged as "Zusy" -- which a quick Google search tells me is Please login to see this link.. If I download any random software that flags as a "keylogger", my natural instinct is: "this thing is trying to steal my passwords and possibly access my bank accounts."

    Now, obviously these Fusion programs aren't actually malware; it's a false positive. But the implied severity of a keylogger / identity-theft trojan (even as a false positive) is very serious.

    What is happening inside a blank Fusion app that triggers a "Key Logger" alert in all of these virus engines? It can't possibly be a Fusion extension or any events, because this is a blank, default, new MFA file (built into an EXE).

    I wish these anti-virus programs would tell us a little more info about how or what exactly they are "detecting".

    Nice, though it's a bit outdated; lots of things have changed since MMF2. For example, regarding portability, CF 2.5 games can be ported to a large number of platforms, including consoles.

    I recently noticed that the XLua extension seems to be partially broken in CF 2.5 -- particularly the module that interfaces with Fusion objects directly, which is what made something like Baba possible. It's been documented in this thread: Please login to see this link.

    I imagine some minor change to Fusion's internals over various updates caused XLua to break. Who would I have to talk to to get this fixed?

    I would also suggest that Clickteam go 100% in and make XLua an official Fusion extension. I've only been playing with it for a little while, but after seeing Baba, I'm convinced that Lua + Fusion is an absolute powerhouse combo. I've already been able to improve a few of my Fusion-made programs with it.

    It's just as Arvi Teikari said: Lua's advantages and Fusion's advantages complement each other really well.

    Just wanted to chime in that I have the same problem. I noticed that the MMFI example for XLua doesn't work. The example is supposed to move the enemies with a basic platform movement using a Lua script, but nothing happens. It's as Erkabubben said: the whole XLua-MMFI feature just isn't working, which breaks a core part of what makes XLua useful in Fusion.

    I'm guessing some small change to Fusion's internals during an update must have broken this feature. I imagine the fix would be a minor edit to the XLua extension's source code, like a variable/function name change or something.

    I was really looking forward to controlling Fusion's objects via Lua scripts, like Baba Is You does. But I'm guessing Baba works because it's built on an older version of MMF2, not CF 2.5.

    Who should I talk to about fixing this extension?
    Or, does anyone know which previous version of Fusion worked with XLua? I'd be willing to roll back if necessary.