Yeah, I’ve set the limit to 90x90, but it’s possible that only applies to locally uploaded avatars. However, we will restrict the size of avatars manually if need be. Anything that requires the author bar to become bigger will not be allowed.
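If it turns out the 90x90 limit really doesn’t apply to remote avatars, one stopgap would be to clamp the display size in the stylesheet. A minimal sketch, assuming avatars carry a class like the illustrative img.avatar below - not necessarily our actual markup:

    /* Illustrative selector - clamp any avatar to the 90x90 limit */
    img.avatar {
        max-width: 90px;
        max-height: 90px;
    }

That only protects the layout, mind you - the full-size image still gets downloaded.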
I wouldn’t worry too much about hotlinking, Km2_33. If anyone hotlinks to an offsite host that isn’t designed for the purpose, it’s not our fault that the host allows it. Every web server has the facility to disable hotlinking - if they don’t disable it, we can only assume they are happy for it to occur.
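For what it’s worth, if we ever wanted to block hotlinking of our own images, the usual recipe on an Apache server is a referer check in .htaccess. A sketch only, assuming mod_rewrite is enabled and with example.com standing in for the real domain:

    # Refuse image requests whose referer is some other site
    RewriteEngine On
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
    RewriteRule \.(gif|jpe?g|png)$ - [F,NC]

The first condition lets requests with no referer through, so browsers that suppress referers still see the images.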
Hey Ben, to answer your question, the site seems to be allowing avs of all sizes. That said, is this av okay? 'Cause I’d love to keep it.
Perhaps the database allocates 6kb of space per member automatically, on each database call - whether it is used or not? With over 4 thousand members here, that’s 6kb x 4k, roughly 24mb. Anyway, Google-bots really suck (the bandwidth). The bigger the forum/web-site, the more Google sucks; they seem to spider 24/7.
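If Googlebot really is the main bandwidth hog, a robots.txt can at least keep the spiders out of the low-value pages. A minimal sketch - the script names below are illustrative, not necessarily ILP’s actual paths:

    # Keep all spiders out of pages that burn bandwidth without adding content
    User-agent: *
    Disallow: /memberlist.php
    Disallow: /profile.php
    Disallow: /search.php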
I wrote about this when I first came to the site… but let me do it again
How ’bout deleting some of the 3000+ people who have 0 posts and registered like 2 years ago? It may not save much space, but it’ll clear things up a bit.
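Finding those accounts should be a one-liner in most forum databases. A hedged sketch, assuming a phpBB-style users table - the table name, column names and cutoff date are all illustrative:

    -- Accounts with zero posts that registered before some cutoff
    SELECT user_id, username
    FROM phpbb_users
    WHERE user_posts = 0
      AND user_regdate < UNIX_TIMESTAMP('2004-01-01');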
This is actually a good idea! …only 1% of all forum members will actually post that first post, anyway…
Unfortunately that’s not possible as it would ruin our ranking in the search engines.
It might do, but this would not affect bandwidth - only storage.
Google have now promised to behave a little less eagerly!
There’s gotta be some code way around that…
but… I don’t code so I don’t know what that would be.
Unfortunately, you can’t get around this problem using code. Google has over 100 variables in its ranking algorithms and you either have those variables – like lots of good content – or you don’t.
You can, however, strengthen the variables you do have, which will make up for a lack in other areas. It’s not an exact science: for one thing, Google’s algorithms are their ‘secret herbs and spices’, and for another, constant changes in the importance of these variables (and other methods) render much experimentation invalid.
Personally, I don’t think getting rid of 3000 names is going to do much - particularly since you found Googlebot to be the main culprit. There are other ways to optimize the site vis-à-vis ranking, but this is a big topic. PM me, Ben/Obw, and I’ll pass some points on.
grumble, grumble
Very well. Only I’ll have to use the old egg for now, as I have no idea where the other one went. Archived somewhere, I’m sure.
I think if you are having Google spider ILP, you should dump anything beyond page 10. Honestly, if it’s that old, they can re-post.
Also, something another site I visit does is trim excessive posts. One fine example would be the 100+ page threads in Mundane Babble. You could cut those in half easily, and I don’t think anyone would be offended.
Thanks everyone for your ideas.