What part of your site is sensitive?
If it's just the data in the files, then overwriting each file in place with data from /dev/zero or /dev/urandom (avoid /dev/random, which is far too slow for bulk writes) using a standard tool such as dd should do fine — with the caveat that this only works if the filesystem overwrites blocks in place. Copy-on-write filesystems such as Btrfs or ZFS allocate new storage on every write, so the old blocks survive.
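A minimal sketch of the dd approach (the filename is a placeholder, and `stat -c` is GNU coreutils syntax): `conv=notrunc` overwrites the existing blocks instead of truncating, and `fdatasync` forces the write to disk.

```shell
printf 'this is sensitive\n' > secret.txt   # demo file to wipe

# Overwrite the file's existing blocks in place with zeroes,
# preserving its exact size.
size=$(stat -c %s secret.txt)
dd if=/dev/zero of=secret.txt bs="$size" count=1 conv=notrunc,fdatasync 2>/dev/null
```

After this, the file still exists at its original size, but its contents are all zero bytes.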
If the directory structure itself is sensitive (the filenames, for example), you'll need a more thorough approach. I don't know of a ready-made tool for this, but the idea would be: after overwriting the file contents, delete all the files in each directory, create a batch of dummy files there with touch and delete those too (so the old names are likely to be overwritten in the directory's on-disk entry table), then remove the directory itself — working depth-first. Something along these lines could be assembled with find, but I'm not aware of a ready-to-go tool that does it.
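Here's a hedged sketch of that idea — a depth-first recursive function. The directory name and the dummy-file count of 100 are arbitrary choices, `find -delete` is a GNU extension, and dotfiles are ignored for simplicity; this churns directory entries but makes no hard guarantee about what the filesystem keeps on disk.

```shell
scrub_dir() {
    # Recurse into subdirectories first (depth-first).
    for sub in "$1"/*/; do
        [ -d "$sub" ] && scrub_dir "${sub%/}"
    done
    # Delete the regular files (assumed already overwritten, e.g. with dd).
    find "$1" -maxdepth 1 -type f -delete
    # Create and delete dummy names so the old names are likely to be
    # overwritten in the directory's entry table.
    i=0
    while [ "$i" -lt 100 ]; do
        touch "$1/pad-$i"
        i=$((i + 1))
    done
    rm -f "$1"/pad-*
    rmdir "$1"
}

# Demo tree; "sensitive-dir" is a placeholder name.
mkdir -p sensitive-dir/inner
printf 'secret' > sensitive-dir/inner/notes.txt
scrub_dir sensitive-dir
```

Shell functions get their own positional parameters, so `"$1"` stays correct across the recursive calls without needing `local`.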
In your case, shredding the files (repeatedly overwriting them with random data) is overkill: nobody is going to try to recover your data by taking the physical drive apart and reading residual track-edge signals. I'm not sure that recovery technique even works on modern high-density drives; a single pass of zeroes may well be more than sufficient.
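That said, if you want a ready-made tool, GNU coreutils ships shred, and you can dial it down to a single pass (the filename is a placeholder):

```shell
printf 'sensitive data\n' > secret.txt   # demo file

# shred does multiple random passes by default (three in current
# coreutils); -n 1 reduces that to one, -z adds a final pass of zeroes
# to make the wipe less conspicuous, and -u deletes the file afterwards.
shred -n 1 -z -u secret.txt
```

The same filesystem caveat applies: shred only helps if writes go to the file's existing blocks, which copy-on-write and some journaling setups don't guarantee — shred's own man page warns about this.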