There are many different answers, depending on how exactly you want to use the output, and on what assumptions you are making about which odd characters can't appear in the filenames. The find command doesn't have an option to escape special characters, and even if it did, its choice of what to escape might not match the exact needs of your program. Given that the only bytes that can't appear in a filename are '/' and NUL, there are a lot of edge cases.
In my case, I wanted to process the file names as elements in an array in Bash, so I wanted something like:
FILES=( $(find . -type f) )
That doesn't work with spaces (or tabs, for that matter). It also splits on the newlines from the find output, so you can't rely on them as separators either. You can set the field separator in Bash to something different. Ideally you would set it to null and use -print0 in find, but null is not allowed in Bash's IFS. My solution is to pick a character that we assume is not in any filename, like 0x01 (Ctrl-A), and use that:
set -f                    # suspend globbing so names like '*' aren't expanded
IFS=$'\x01'               # split only on the 0x01 separator
FILES=( $(find . -type f | tr '\n' '\001') )   # turn each newline into 0x01
set +f
unset IFS                 # restore default word splitting
for F in "${FILES[@]}"; do
    useful_command "$F"
done
Note the need to unset IFS afterward to restore the default word splitting. This still won't work for filenames that contain newlines, but it should handle most other filenames.
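If newlines do matter, you can skip the intermediate array and read find's -print0 output directly; a sketch, with useful_command standing in as before:

```shell
# NUL-safe loop: read -d '' uses NUL as the delimiter, so every legal
# filename, newlines included, comes through intact.
while IFS= read -r -d '' F; do
    useful_command "$F"
done < <(find . -type f -print0)
```

On bash 4.4 or later, `mapfile -d '' FILES < <(find . -type f -print0)` fills the array in one step instead.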
If you're really paranoid, then you'll need to pipe find into hexdump, split out the results to get every byte value that actually occurs, and look for one that isn't in the results. Then use that value as the separator. I'm sure Johnny Drop Tables has files with every byte value in the file names. If you're paranoid, create file and directory names using all 254 legal bytes (everything but '/' and NUL) and test. Probably the only solutions that would pass that test are ones using 'find -print0' piped to xargs, or a custom C program.
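A sketch of that torture test, assuming GNU find and grep (the `f` name prefix and the `grep -zc` counting trick are just for the sketch):

```shell
# Create one file per legal byte value (1-255, skipping 47, which is '/'),
# then count them through a NUL-delimited pipeline that no filename can confuse.
dir=$(mktemp -d)
for i in {1..255}; do
    [ "$i" -eq 47 ] && continue               # '/' can't appear in a name
    printf -v c "\\$(printf '%03o' "$i")"     # byte value -> one character
    touch "$dir/f${c}"
done
find "$dir" -type f -print0 | grep -zc .      # counts NUL-terminated names; should report 254
```

Once that count checks out, the same stream drives the real work: `find "$dir" -type f -print0 | xargs -0 useful_command`.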