[futurebasic] Re: [FB] STR#


From: "Terrald J. Smith, M.D." <tjsmith@...>
Date: Thu, 06 May 1999 21:15:39 -0500
> !!!!!!! That's a lot of strings. _Way_ too many to store in STR#
> resources and read into an array one at a time. Is there any particular
> reason you _have_ to use STR# resources? Could all that be in a data file?
>
> I'd use the "write array to disk" and "read array from disk" functions,
> if I had that much data. Using a hacked version of your current program
> plus "writefilerecord", create your data file (after loading the array
> from STR# once). Then take out all the STR# stuff and replace it with
> "readfilerecord". If it's static data, you won't need writefilerecord in
> the finished program.
>
> Here's the relevant parts;
>
> DIM gFirst%(1)                   'denotes start of 'to save to disk' area
> DIM gString1$(15000)
> DIM gString2$(900)
> DIM gLast%(1)                    'denotes end of 'to save to disk' area
> END GLOBALS
>
> LOCAL FN writefilerecord
>   DIM arraySize&
>   OPEN "O", 1, gFileName$,,gSaveVolum%
>   PRINT #1, gVersion$
>   arraySize& = @gLast%(0) - @gFirst%(0)
>   WRITE FILE #1, @gString1$(0),arraySize&
>   CLOSE #1
> END FN
>
> LOCAL FN readfilerecord
>   DIM arraySize&
>   OPEN "I", 1, gFileName$,,gSaveVolum%
>   INPUT #1, gVersion$
>   arraySize& = @gLast%(0) - @gFirst%(0)
>   READ FILE #1, @gString1$(0),arraySize&
>   CLOSE #1
> END FN
>
> This is liberally hacked from a much bigger routine, so it may have typos
> or bugs...
>
> Bill

The reason for using the STR# stuff was to avoid reading each record over the
network by keeping the STR# in the program...it did massively decrease the
speed problem, not just for the computer doing it but for the others competing
on the network.  However, the crashes suck.  Based on your recommendations, I
decided to trash the STR# approach and use arrays.  I can load the entire file
(729696 bytes) fairly quickly with READ FILE#, but then converting it to
strings with a FOR/NEXT loop is slow.  Admittedly it is still faster than the
30 min of loading each record individually, but I have a thing about seconds
instead of minutes.

Can't figure a way to use Bill's READ FILE #1,@gRawData(0),arraySize&
suggestion.  Yes, I know it could probably be used the way he recommends,
writing and then reading, but I still have to have the ability to read/write
individual records in the file.  The records are not static...sometimes they
may not change for months, sometimes they change several times a day.  This
stuff is stored in a base datafile, which is what changes records.
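
For those one-record-at-a-time updates, FB's random-access file mode might do the trick: open the file with a fixed record length and use RECORD to seek before each read or write.  This is only a sketch, untested, and it assumes 48-byte records with no header in front of them (adjust the record numbering if your file starts with a version string, and check whether RECORD is 0- or 1-based in your FB version):

DIM recLen, n

recLen = 48
OPEN "R", 1, gFileName$, recLen, gSaveVolum%

'read record n straight into the array slot
RECORD #1, n
READ FILE #1, @DIAGNOSIS$(n) + 1, recLen
POKE @DIAGNOSIS$(n), recLen             'set the Pascal length byte

'after editing, write the same record back in place
RECORD #1, n
WRITE FILE #1, @DIAGNOSIS$(n) + 1, recLen

CLOSE #1

That way the base datafile can change a single record without rewriting the whole 729696 bytes.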

Here is sort of what I am doing to bypass the STR# stuff, but I wish I could
just MOVE the StrHndl& stuff into DIAGNOSIS$() without the loop.

DIM 49 DIAGNOSIS$(16000)                'a global

part of the local fn:

DIM DATAFILE$, MODIFIED$, itemnumber$, OSErr
DIM itemnumber, Filesize&, StrHndl&
DIM loop, gOffSet&, Myloop

itemnumber = LOF(FILENUMBER,48) : Filesize& = LOF(FILENUMBER,1) : gOffSet& = 0
StrHndl& = FN NEWHANDLE(Filesize& + 5)  'buffer for the whole file
OSErr = FN HNOPURGE(StrHndl&)
READ FILE #FILENUMBER, [StrHndl&], Filesize&

This next part is what creates the speed problem, especially on the slow computers:

FOR Myloop=1 TO itemnumber
  AZ$=""
  FOR loop=1 TO 48
    AZ$=AZ$+CHR$(PEEK([StrHndl&]+gOffSet&))
    INC(gOffSet&)
  NEXT
  DIAGNOSIS$(Myloop)=AZ$
NEXT
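
If the layout assumptions hold (48 bytes per record, Pascal-style strings whose first byte is the length), the per-character PEEK loop could probably be collapsed to one BLOCKMOVE per record.  A sketch, untested, using the same variables as above; the HLOCK/HUNLOCK pair is there because the copy reads through the dereferenced handle:

OSErr = FN HLOCK(StrHndl&)              'keep the block from moving mid-copy
FOR Myloop = 1 TO itemnumber
  POKE @DIAGNOSIS$(Myloop), 48          'set the Pascal length byte
  BLOCKMOVE [StrHndl&] + gOffSet&, @DIAGNOSIS$(Myloop) + 1, 48
  gOffSet& = gOffSet& + 48
NEXT
OSErr = FN HUNLOCK(StrHndl&)

That gets rid of 48 PEEKs and a string concatenation per record, which is where the time on the slow machines is likely going.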

Anyway, thanks for the ideas/suggestions on getting rid of the STR# mass; I
hope it will stop the crashes.

Terrald