The client wanted to manage which users are allowed into the portal by submitting an updated encrypted text file on a daily (or weekly) basis. This was necessary so that anyone who left the company could be promptly blocked from accessing the portal.
This four-part automated process logs, reports, sends emails, and archives every run for tracking purposes. It can handle large files with single-threaded processing and no timeout errors, because the application calls itself after every 100 lines until the file is complete. The application is also dynamic within our client portal: drag and drop it into any folder, and it works. To be brief, here are the processes:
Process A: retrieve the encrypted text file via FTP, archive it, and then delete the remote file. On failure, email the administrators.
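The original ran inside the client portal and its language is not stated here; the following is a minimal Python sketch of Process A. Every host name, credential, path, and email address is a placeholder assumption, not the client's real configuration.

```python
import ftplib
import shutil
import smtplib
from email.message import EmailMessage
from pathlib import Path

# Placeholder configuration -- none of these are the real values.
FTP_HOST = "ftp.example.com"
REMOTE_FILE = "users.txt.enc"
ARCHIVE_DIR = Path("archive")
ADMIN_EMAIL = "admin@example.com"

def retrieve_archive_delete(local_dir):
    """Download the encrypted file, keep an archive copy, delete the remote original."""
    local_path = local_dir / REMOTE_FILE
    with ftplib.FTP(FTP_HOST, "user", "password") as ftp:
        with open(local_path, "wb") as f:
            ftp.retrbinary(f"RETR {REMOTE_FILE}", f.write)
        shutil.copy2(local_path, ARCHIVE_DIR / REMOTE_FILE)  # archive copy
        ftp.delete(REMOTE_FILE)  # remove the remote file once archived
    return local_path

def build_failure_message(error):
    """Compose the notification emailed to administrators on failure."""
    msg = EmailMessage()
    msg["Subject"] = "Process A: retrieval failed"
    msg["From"] = "portal@example.com"
    msg["To"] = ADMIN_EMAIL
    msg.set_content(f"Could not retrieve {REMOTE_FILE}: {error}")
    return msg

def run():
    try:
        retrieve_archive_delete(Path("incoming"))
    except Exception as exc:
        with smtplib.SMTP("localhost") as smtp:
            smtp.send_message(build_failure_message(exc))
```

The retrieval, archive, and delete happen inside one FTP session so the remote file is only removed after the local copy and archive succeed.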
Process B: if the new local encrypted text file exists, decrypt it for processing, then re-encrypt the file (with our encryption stamp) to send to our sister company via FTP. The decryption step is a stand-alone process called by the SQL Scheduler.
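The actual cipher and "encryption stamp" format are not described here, so in this Python sketch base64 stands in for the real decrypt/re-encrypt routines and the stamp is an invented header; only the check-decrypt-re-encrypt flow mirrors Process B.

```python
import base64
from pathlib import Path

# Hypothetical stamp; the real encryption stamp format is not specified.
STAMP = b"SISTERCO:"

def decrypt(data):
    """Stand-in for the real decryption routine (base64 for illustration only)."""
    return base64.b64decode(data)

def reencrypt(plaintext):
    """Stand-in for re-encryption with our stamp prepended."""
    return STAMP + base64.b64encode(plaintext)

def process_if_exists(in_path, out_path):
    """If the new local encrypted file exists, decrypt it for processing
    and write the re-encrypted copy destined for the sister company."""
    if not in_path.exists():
        return False
    plaintext = decrypt(in_path.read_bytes())
    out_path.write_bytes(reencrypt(plaintext))
    return True
```

In production the decrypt step ran as its own process triggered by the SQL Scheduler; here both halves are shown in one module purely to make the data flow visible.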
Process C: process the file at 100 lines per pass. The application calls itself until complete: once the first 100 lines are processed, the state variables are passed through the URL to re-invoke the application, repeating until the end of the file is reached. Once fully processed, the application writes a report (file size, start-to-end processing time, profiles updated, profiles deleted, new profiles, etc.), emails the report details to the client, deletes the file (no decrypted information is stored on our server), and writes a log.
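The 100-lines-per-pass self-recall can be sketched as follows. In the portal the next offset was passed back through the URL on each re-invocation; this Python sketch replaces that HTTP round-trip with a plain loop, and the per-line profile logic is elided.

```python
CHUNK = 100  # lines processed per invocation, as in the original

def process_chunk(lines, offset, stats):
    """Process one 100-line slice; return the next offset, or None when done.
    In the real application this offset traveled back through the URL."""
    for line in lines[offset:offset + CHUNK]:
        # ... update/delete/create a profile based on the line ...
        stats["processed"] += 1
    next_offset = offset + CHUNK
    return next_offset if next_offset < len(lines) else None

def run(lines):
    """Simulate the self-recall loop until the end of the file is reached,
    then return the counters that would feed the final report."""
    stats = {"processed": 0}
    offset = 0
    while offset is not None:
        offset = process_chunk(lines, offset, stats)
    return stats
```

Capping each invocation at 100 lines is what keeps any single request short enough to avoid server timeouts on large files.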
Process D: send the new file (re-encrypted by Process B) to our sister company via FTP, and write a log.
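A Python sketch of Process D's upload-and-log step, assuming a hypothetical sister-company FTP endpoint and a simple one-line-per-transfer log format (neither is specified in the original):

```python
import ftplib
from pathlib import Path

def log_line(filename, ok):
    """Build the log entry recorded after each transfer attempt."""
    return f"{filename}: {'sent' if ok else 'failed'}"

def send_to_sister(path, host="ftp.sister.example.com",
                   user="user", password="password"):
    """Upload the re-encrypted file via FTP, then write and return the log entry.
    Host and credentials are placeholders."""
    try:
        with ftplib.FTP(host, user, password) as ftp:
            with open(path, "rb") as f:
                ftp.storbinary(f"STOR {Path(path).name}", f)
        ok = True
    except ftplib.all_errors:
        ok = False
    entry = log_line(Path(path).name, ok)
    with open("process_d.log", "a") as logf:
        logf.write(entry + "\n")
    return entry
```

Logging both success and failure means the daily archive shows every transfer attempt, not just the ones that went through.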