Writing To A Text File.

I have an expert advisor that I am running in the tester which takes data and saves it to a text file. The EA simply takes various variables and saves them as strings in a text file, through the use of the FileAppend function. Attached is the code. What I have noted is that once the file exceeds 4GB, it either gets overwritten or the data within it gets deleted. In reviewing Process Monitor, I noticed a "Buffer Overflow" message in the "Result" column, and that the file offset of the last write jumps from position 24,430,494 to an ending offset of 18,446,735,827,427,981,992.
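
In essence, the attached code is an append helper along the lines of this simplified sketch (not the exact attachment):

void FileAppend( string fileName, string line ){
   int h = FileOpen( fileName, FILE_CSV|FILE_READ|FILE_WRITE );
   if( h == -1 ){
      Print( "FileOpen failed, error ", GetLastError() );
      return;
   }
   FileSeek( h, 0, SEEK_END );   // jump to the end of the file before writing
   FileWrite( h, line );
   FileClose( h );
}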

Is there a limit to the size of the text file that FileAppend is writing to?

ForexSurfr:

Is there a limit to the size of the text file that FileAppend is writing to?

Well, FileAppend is your own function, not a system function. The system functions don't appear to have explicitly documented limits, but FileSeek, for example, takes an INT offset. An INT, being a signed 32-bit type, tops out at 2GB, which might be the sort of problem you are having, although seeking to the end of the file doesn't actually use the offset. Your code is very simple and isn't explicitly limited by INTs anywhere, so the fact that it fails is a strong indication that the limit is in the system functions rather than a bug in your code. That enormous ending offset also looks like a negative offset being displayed as an unsigned 64-bit number, which points the same way. I would suggest segmenting the data into 1GB files and sticking them back together later if necessary, something like the sketch below.
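
Something along these lines would do it (a sketch only; the names and the 1GB threshold are illustrative, untested):

#define MAX_SEG_SIZE 1000000000      // ~1GB: keeps FileSize() and offsets well inside an INT

int segHandle = -1;
int segNumber = 0;

void AppendSegmented( string baseName, string line ){
   // roll over to a new segment once the current one is full
   if( segHandle != -1 && FileSize( segHandle ) >= MAX_SEG_SIZE ){
      FileClose( segHandle );
      segHandle = -1;
      segNumber++;
   }
   if( segHandle == -1 ){
      segHandle = FileOpen( baseName + "_" + segNumber + ".txt",
                            FILE_CSV|FILE_READ|FILE_WRITE );
      if( segHandle == -1 )
         return;                                 // could not open; give up quietly
      FileSeek( segHandle, 0, SEEK_END );        // append if the segment already exists
   }
   FileWrite( segHandle, line );
}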

I've changed my mind. Having tested FileWrite, it is still going strong at 7 GBytes.

int handle = -1;                       // global so deinit() can clean up
//------------------------------------------------------------------------
int deinit(){
   if( handle != -1 )                  // in case we close the script before it closes the file
      FileClose( handle );
   return( 0 );
}
//------------------------------------------------------------------------
int start(){

   handle = FileOpen( "bigFile.txt", FILE_CSV|FILE_WRITE );
   if( handle == -1 ){
      Print( "Failed ignominiously" );
      return( 0 );
   }

   // 26 blocks of 100 million short lines: several GB of text in total
   for( int n=0; n<26; n++ ){
      string str = CharToStr( 'A' + n );
      Print( "Working on " + str );
      for( int i=0; i<100000000; i++ ){
         FileWrite( handle, str + i );
      }

      FileFlush( handle );             // push each completed block out to disk
   }

   FileClose( handle );
   handle = -1;

   return( 0 );
}

It's quite possible that you can't read a file that big, and that opening the file as read/write is therefore causing the problem. I don't see the need to keep closing and reopening the file; but obviously, if the system crashes or the EA gets reset, your data file will get overwritten unless you take the precaution of naming the file by date/time etc., something like the sketch below.
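
For example (a rough sketch; the exact naming format is up to you):

int OpenDatedFile( string baseName ){
   string stamp = TimeToStr( TimeLocal(), TIME_DATE|TIME_SECONDS );
   for( int k=0; k<StringLen(stamp); k++ )
      if( StringGetChar( stamp, k ) == ':' )     // ':' is not legal in a Windows file name
         stamp = StringSetChar( stamp, k, '-' );
   return( FileOpen( baseName + "_" + stamp + ".txt", FILE_CSV|FILE_WRITE ) );
}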

Personally I would still use smaller file sizes.

dabbler,

Thank you for your comments.