Conversation


@krinor krinor commented Feb 16, 2021

Hi!
I'm Krille from the vcfed.org forum.

I don't have TASM, so I don't know if this even assembles (my preferred assembler is NASM), but I think it should work.

Additional ideas for optimizations:

  1. Move the interrupt handler to offset 5Ch and let the array of drive structures come after it. This would reduce memory usage, as the TSR would only need to retain data for the number of drives actually in use, which most of the time will be a lot less than 18. It would also allow more than 18 drives, though admittedly that would rarely be needed.
  2. Remove the stack. A stack of 256 bytes (or more) is more or less standard in DOS, and since this TSR's own stack is only 64 bytes, there's a good chance that the application calling interrupt 21h has more free stack space than the TSR does.
  3. Remove the C-style calling convention of passing parameters to procedures on the stack; use global variables instead (see my changes to the getTotalClusters procedure for an example).
  4. Inline all procedures unless they are called from several places (initdrive is one example).
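To make idea 3 concrete, here's a rough NASM-style sketch. getTotalClusters is the real procedure named above, but driveNum and gtcDrive are hypothetical names standing in for whatever the actual code uses:

```
; C-style convention: argument pushed on the stack
        push    word [driveNum]         ; driveNum: hypothetical variable
        call    getTotalClusters        ; callee sets up bp, reads [bp+4]
        add     sp, 2                   ; caller cleans up the stack

; Global-variable convention: no push/cleanup, no bp frame in the callee
        mov     al, [driveNum]
        mov     [gtcDrive], al          ; hypothetical global read by the routine
        call    getTotalClusters
```

The second form trades re-entrancy for a few bytes and cycles per call, which is usually a fine trade inside a small single-instance TSR.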

Thanks for the great work!

@ChartreuseK
Owner

Hey, sorry for the long delay in responding to this. I've been quite busy and hadn't really had time to look into it.

I can't merge this pull request directly as you've completely changed the formatting of the code. The original formatting is designed to fit nicely with 8-character-wide hard tabs on an 80-column screen. I'm doing my development on the XT itself, so being able to easily read the code and comments is a concern. Some changes also break compatibility with TASM in favour of NASM, such as changing explicit sizes and types for "pointers" to implicit ones: `mov BYTE PTR [someaddr], al` is required in TASM, whereas NASM is fine with `mov [someaddr], al` since AL implies the size.
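For reference, the two assemblers spell the same store differently (a sketch, using the `someaddr` placeholder from the example above):

```
; TASM: an explicit size override is required on the memory operand
        mov     BYTE PTR [someaddr], al

; NASM: the PTR keyword doesn't exist; the register operand implies the
; size, and a bare size keyword is only needed when no register supplies one
        mov     [someaddr], al
        mov     byte [someaddr], 0
```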

As for the optimizations, I see a couple that I could use, though it's a bit hard to spot them in the clutter of all the changed formatting. Most of them occur in the part after the init label, which is the split between the code in the actual TSR and the initialization code that loads it. Anything after init is removed from memory once the TSR is installed, so performance and memory usage beyond that point are rather moot.

For the possible optimizations:

  1. Moving the interrupt handler around would be a fair amount of work. It can be done, and I was contemplating it: I'd need to tell the assembler to assemble the code as if it were located at 5Ch rather than 100h, then move it there during init. It'd save a bit of memory footprint, so perhaps I'll do that at some point.
  2. I'd personally rather keep the stack, though I can certainly trim its size down to the maximum my program needs. The reason is that I don't want my program to break some other program that gets interrupted while its own stack is running low. Removing the stack entirely could introduce bugs and break compatibility.
  3. The only function where dropping the C calling convention would make sense is getUsedClust, as it resides in the TSR part; the other functions are only part of the initialization code, so their footprint doesn't matter. It'd save maybe a handful of bytes at most, but it could be done.
  4. Like 3, this doesn't make much sense here. The only procedure within the TSR itself is getUsedClust, and the only reason it isn't inlined was to improve code clarity. I could inline it and save those few bytes; inlining the other functions, however, would just make the code unreadable for no benefit.
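A minimal sketch of the init-time move described in point 1 (NASM-flavoured; `resident_image` and `RESIDENT_LEN` are hypothetical names, and segment-register bookkeeping is elided):

```
        cld
        mov     si, resident_image      ; handler bytes, assembled with
        mov     di, 5Ch                 ;   addresses relative to 5Ch
        mov     cx, RESIDENT_LEN
        rep     movsb                   ; slide the handler down into the PSP
        mov     dx, (5Ch + RESIDENT_LEN + 15) / 16  ; paragraphs to keep
        mov     ax, 3100h               ; INT 21h/AH=31h: terminate, stay resident
        int     21h
```

Since DOS INT 21h function 31h keeps memory in 16-byte paragraphs starting at the PSP, reclaiming the FCB area at 5Ch shrinks the resident footprint by roughly the size of the unused PSP tail.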
