Getting Web Files

Posted by Peter Howes on October 31, 2007

Hi All

I use the following code to download a web page. Once it is downloaded, I parse the page to extract the information I need, and this process runs every 20 seconds. However, after the first download the data in the page never changes, even though it may have changed on the web site. So it appears my routine is reading a locally cached copy of the page rather than the live page itself.
Is there any way around this? Thanks.

File Download Code:

function GetInetFile(const fileURL, FileName: String): Boolean;
// Requires Windows, SysUtils, Forms and WinInet in the uses clause.
const
  BufferSize = 1024;
var
  hSession, hURL: HInternet;
  Buffer: array[1..BufferSize] of Byte;
  BufferLen: DWORD;
  f: File;
  sAppName: string;
begin
  Result := False;
  if FileExists(FileName) then DeleteFile(FileName);

  if OnLine then begin
    sAppName := ExtractFileName(Application.ExeName);
    hSession := InternetOpen(PChar(sAppName),
      INTERNET_OPEN_TYPE_PRECONFIG, nil, nil, 0);
    try
      hURL := InternetOpenUrl(hSession,
        PChar(fileURL), nil, 0, 0, 0);
      try
        AssignFile(f, FileName);
        Rewrite(f, 1);
        try
          repeat
            InternetReadFile(hURL, @Buffer,
              SizeOf(Buffer), BufferLen);
            BlockWrite(f, Buffer, BufferLen);
            if BufferLen > 0 then Result := True;
          until BufferLen = 0;
        finally
          CloseFile(f);
        end;
      finally
        InternetCloseHandle(hURL);
      end;
    finally
      InternetCloseHandle(hSession);
    end;
  end;
end;
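
One direction I have been looking at, though I have not tested it yet, is the dwFlags parameter of InternetOpenUrl. From what I have read, WinInet will serve a cached copy unless you ask it not to, so something along these lines might force a fresh download each time:

// Untested sketch: INTERNET_FLAG_RELOAD asks WinInet to fetch the page from
// the origin server instead of the local cache, and INTERNET_FLAG_NO_CACHE_WRITE
// stops the downloaded copy from being written to the cache. Both constants
// are declared in the WinInet unit.
hURL := InternetOpenUrl(hSession, PChar(fileURL), nil, 0,
  INTERNET_FLAG_RELOAD or INTERNET_FLAG_NO_CACHE_WRITE, 0);

Can anyone confirm whether that is the right approach, or suggest something better?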

Follow Ups