Gary Illyes shared a fun little tidbit on LinkedIn about robots.txt files. He said that only a tiny number of robots.txt files are over 500 kilobytes. I mean, most robots.txt files have just a few lines of text, so this makes sense, but it's still a fun tidbit of data.
Gary looked at over a billion robots.txt files that Google Search knows about and said only 7,188 of them were over 500 KiB. That's less than 0.000719%.
He wrote, "One would assume that out of the billions (yes, with a ) of robots.txt files Google knows of, more than 7188 would be larger in byte size than the 500kiB processing limit. Alas. No."
Yea, the SEO point here is that Google can process up to 500KB of your robots.txt file, but most of these files don't even come close to that file size.
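If you want to sanity-check your own site against that limit, here is a minimal sketch in Python. It is a hypothetical helper (not an official Google tool); the limit constant reflects the 500 KiB figure mentioned above.

```python
# Hypothetical helper: check whether robots.txt content fits under
# Google's 500 KiB robots.txt processing limit.

GOOGLE_ROBOTS_LIMIT = 500 * 1024  # 500 KiB, the limit discussed above


def fits_processing_limit(robots_txt: bytes) -> bool:
    """Return True if the robots.txt content is within the 500 KiB limit."""
    return len(robots_txt) <= GOOGLE_ROBOTS_LIMIT


# A typical few-line robots.txt is nowhere near the limit.
sample = (
    b"User-agent: *\n"
    b"Disallow: /private/\n"
    b"Sitemap: https://example.com/sitemap.xml\n"
)
print(fits_processing_limit(sample))  # True
```

Most real-world files, like the sample above, are a few dozen bytes; only content past the limit risks being ignored.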
Forum discussion at LinkedIn.