Merry Anime Christmas ^^


Merry Christmas to everybody, just kick some ass and enjoy your gifts heheheheh ^^

Alternative to All in One SEO Pack



Well, we have a lot of WordPress installs (not only for blogging; WordPress has come a long way and is almost at the point of becoming a full-fledged CMS like Drupal or Joomla), so having a good SEO plugin to take care of the little tidbits of SEO is almost mandatory. Using the second most popular WordPress plugin, "All in One SEO Pack", seems like a no-brainer... that is, it used to be. Now more than ever it's a nuisance, having moved from a very simple, helpful plugin to a major bloated beast filled with plenty of idiotic choices, and I'll gladly name a few:

1. Constant Updates. Ohhh, I'm all for updates, but come on, roll them into bigger releases instead of pushing the equivalent of nightly builds. Unless it's a security fix (and I have my doubts, with this kind of plugin), it should ship in big updates. I guess they do this because of my second gripe...

2. Updating Deactivates the Plugin. Yep, you update on the plugins page, but you always have to go into the plugin itself to re-activate it. That's just moronic and abusive, pushing ads/promotions/donations at you, jeez.

3. Bloated or Useless Functions. Stuff like pushing added keywords on single posts/pages is not only pretty much useless these days, it borders on promoting keyword stuffing; and using excerpts as descriptions... talk about added bloat.

4. Adding Idiotic Stuff to the code in the head, like...

<!-- All in One SEO Pack 1.6.7 by Michael Torbert of Semper Fi Web Design -->

Not only is this comment bloat in the code, it also announces the freaking version, so if there is a security problem with that release... yayyy

So as of now we are moving all of our WordPress installs from "All in One SEO Pack" to HeadSpace 2 and Platinum SEO Pack ^_^.oO( to see which performs better)

UPDATE: After a year I've done a more thorough Comparison of WordPress SEO Plugins heheheh, and yes, we do have a winner ^_^

How to deal with Web Scraping


Hummmm, since I have several galleries, one thing I encounter often is web scrapers. Personally I don't mind if anybody takes an entire gallery home, or even re-posts it somewhere else; that is fine with me, this is the web. If I wanted the gallery to be private I would have made it so; if it's public and free, then go right ahead...



However, the content itself is not the problem. The problem is that the vast majority of web scrapers ship with bad default settings, or their users configure them too aggressively; it's not uncommon for the load on the server to go from 0.10 to 1 in a heartbeat, or for the server to go down entirely. I know it's partly my fault: I like to restrict the server and the software as little as possible (I could use several methods to limit connections, or to ban IPs when there are too many connections), and because I don't, sometimes I get into trouble. So this is what I normally do.

First of all, I have this in the .htaccess (mod_rewrite), which helps block most scraping software (unless it spoofs its user agent as a browser hehehe):

RewriteEngine On

RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]

RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [OR]

RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]

RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]

RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]

RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]

RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]

RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]

RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]

RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]

RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]

RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]

RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]

RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]

RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]

RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]

RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]

RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]

RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]

RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]

RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]

RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]

RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]

RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]

RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]

RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]

RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]

RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]

RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]

RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]

RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]

RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]

RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]

RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]

RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]

RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]

RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]

RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]

RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]

RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]

RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]

RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]

RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]

RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]

RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]

RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]

RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]

RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]

RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]

RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]

RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]

RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]

RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]

RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]

RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]

RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]

RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]

RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]

RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]

RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]

RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]

RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]

RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]

RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]

RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]

RewriteCond %{HTTP_USER_AGENT} ^Zeus

RewriteRule ^.* - [F,L]

I monitor the load on the server. If the load spikes for more than a couple of minutes, I check the Apache log; if there are lots of connections to the same site from the same IP, I ban that IP with .htaccess by adding these lines:

Order allow,deny
Deny from 100.100.100.100
Allow from all
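Spotting the offending IP can be scripted instead of eyeballed. A quick sketch that counts requests per client IP in an Apache access log (the sample log below is made up for illustration; point awk at your real access log, e.g. /var/log/apache2/access.log):

```shell
# In Apache's common/combined log formats the client IP is the first field,
# so counting requests per IP is a one-liner. A tiny sample log stands in here.
printf '%s\n' \
  '1.2.3.4 - - [01/Jan/2010:00:00:01 +0000] "GET /img/a.jpg HTTP/1.1" 200 512' \
  '1.2.3.4 - - [01/Jan/2010:00:00:02 +0000] "GET /img/b.jpg HTTP/1.1" 200 512' \
  '5.6.7.8 - - [01/Jan/2010:00:00:03 +0000] "GET / HTTP/1.1" 200 128' > sample_access.log

# Requests per IP, busiest first -- the top line is your ban candidate.
awk '{ print $1 }' sample_access.log | sort | uniq -c | sort -rn | head -n 5
```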

(100.100.100.100 being the IP from the logs) and check the load after a couple of minutes. If it's down, fine; if they jumped IPs, I'll do one of two things. If they stay within the same IP range, I'll just block that range, like so:

Order allow,deny
Deny from 100.100.100.
Allow from all

If they aren't, I limit the number of simultaneous connections to 10. I know it will hurt all users, but it's better than nothing. Note that MaxClients is a server-wide MPM directive: it goes in the main Apache config (httpd.conf), not in a directory's .htaccess, and it caps clients for the whole server rather than just the cache or image directory:

MaxClients 10
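As a side note, if a per-directory cap (rather than a whole-server one) is what you're after, the third-party mod_limitipconn module can limit simultaneous connections per client IP. A sketch, assuming the module is installed and /images stands in for your gallery's cache or image path:

```apache
# Cap each client IP at 10 simultaneous connections to the image directory.
# Requires the third-party mod_limitipconn module; /images is a placeholder.
<IfModule mod_limitipconn.c>
    <Location /images>
        MaxConnPerIP 10
    </Location>
</IfModule>
```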

If it still persists, I'll just close the image directory, adding this line to the .htaccess of the cache or image directory (depending on the type of image gallery software you are using):

deny from all

So the site stays up, as do the thumbnails; just the full images won't be accessible for a while. All of these are temporary measures, but for now they do the trick for me. Most of the time banning the IP is enough of a cure, and those bans I always leave in the .htaccess; the other options I normally remove the next day, after the connection storm has passed. Bottom line: if you want to scrape, instead of bombing the server for an hour, make your tool download slowly over a couple of hours. It makes a big difference, and everyone gets what they want.
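For scrapers who want to play nice, the throttling is trivial to do client-side. A minimal sketch of the "download slowly" idea (urls.txt, the delay, and the echo stand-in are all placeholders; swap in your real wget/curl fetch):

```shell
# Fetch a list of URLs one at a time with a pause between requests, instead of
# hammering the server in parallel. urls.txt holds one URL per line.
DELAY=1
printf '%s\n' 'http://example.com/img/001.jpg' \
              'http://example.com/img/002.jpg' > urls.txt

while read -r url; do
  echo "fetching: $url"   # stand-in for a real fetch, e.g.: wget -q "$url"
  sleep "$DELAY"          # the pause is what keeps the server load sane
done < urls.txt
```

Tools like wget can also do this natively with their --wait and --limit-rate options.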

Downtime on Server 1

Seems JaguarPC is having some problems with their network... grrrr, this is becoming troublesome. It's not the first time these past few months; there have been a couple of incidents (compared with basically years of stability...), and they are always very vague about it. The server is rock solid, but the network performance is not on par. Because of this, server 1 is down, so a lot of sites, including s2r.org, are down as well ^_^' Let's see how long this drags on... and whether I have to start moving to one of my other providers....

Micro Update

Humm, sorry for the lack of updates. Kinda going to change how this works: from now on, all hostcult/s2r updates will come from here, and I'll push the older s2r blog to here hehehe, keeping it simple and nice for everyone. More updates in a couple of days; trying to improve performance on a couple of sites and opening a couple of new ones before August is gone hehehe ^-^

ImageBoard Spam List


Since I run several successful anonymous imageboards, one of the ways I prevent spam is a sort of spam list (a blacklist, really) of all the domains/URLs that cannot be posted on the board (one of the main reasons to spam the boards is to post links to phishing or illegal sites); this prevents them from being posted by bots altogether.

So it's an essential tool for keeping the board clean. Even if more automated tools like Akismet or Defensio are added later, this is still a nice, clean, and fast way to keep most spam and idiotic posts off the site. The file is a simple spam.txt in the root of the site, one domain/site per line. This is of course most useful to other imageboard hosts using the same system (Wakaba or Kusaba clone software), so here is our very own custom spam.txt list hehehe, for free, yayy: 1131 domains (I'm actually thinking about making our own system, so anyone can add links and fetch the latest version for their board, so their system is always protected heheheh)

UPDATE (from tentakle and neechan):

Uhh, since the last post we have added a bunch more, so there ya go, an updated spam.txt. This new list has 1372 domains that are/were used in spam posting on imageboards (the old link still works as well hehehe). You can use something like WinMerge or TextOpus to help merge your existing spam file with my awesome one ^^
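If you're merging on a server rather than on Windows, you don't even need WinMerge: since spam.txt is one domain per line, standard Unix tools can merge and deduplicate the lists in one go. A sketch (the file names and domains are placeholders):

```shell
# Merge two one-domain-per-line spam lists into a single deduplicated file.
# spam_old.txt / spam_new.txt stand in for your existing list and the new one.
printf '%s\n' 'spam-a.example' 'spam-b.example' > spam_old.txt
printf '%s\n' 'spam-b.example' 'spam-c.example' > spam_new.txt

sort -u spam_old.txt spam_new.txt > spam_merged.txt   # 3 unique domains remain
```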

Faster Faster! FASTER!

Yep, been doing a lot of improvements, mostly speed and optimization. Almost every site was improved: stuff like CSS compression and image optimization. Also I've created a CDN/mirror system to help speed up the most-used assets (layouts and stuff), so you will probably see increased speed on most sites ^-^

In other news, been working on a couple of new projects, as well as a major reconstruction of tentakle and minor improvements on smutish... ahhh, think that's all.

How to Change Hosts Files

The hosts file is a computer file used to store information on where to find a node on a computer network. It maps hostnames to IP addresses (e.g. mapping www.google.com to 10.0.0.1). The hosts file is used as a supplement to (or a replacement of) the Domain Name System (DNS) on networks of varying sizes. Unlike DNS, the hosts file is under the control of the local computer's administrator (as in, you). The hosts file has no extension and can be edited with most text editors.

The hosts file is loaded into memory (cached) at startup; Windows then checks the hosts file before it queries any DNS servers, which enables it to override addresses from DNS. This can prevent access to listed sites by redirecting any connection attempts back to the local (your) machine (IP address 127.0.0.1). Another feature of the hosts file is its ability to block other applications from connecting to the Internet, provided an entry for the relevant host exists.

So you can use a hosts file to block ads, banners, third-party cookies, third-party page counters, web bugs, and even most hijackers. Here are some instructions for doing so, plus some sites with ready-made hosts files (you just overwrite your own hosts file):

The hosts file location is:
Windows XP at c:\Windows\System32\Drivers\etc
Windows Vista at c:\Windows\System32\Drivers\etc
Windows 2000 at c:\Winnt\System32\Drivers\etc

There you will find a file named hosts (no extension). As we said above, you can edit it with any text editor (on Windows Vista you'll need to run the editor as administrator to save it), and its function is simple: you map IP addresses to hostnames, so the file will look mostly like this...

127.0.0.1    localhost
127.0.0.1    www.bad-spyware-site.com
127.0.0.1    www.site-with-virus.com
127.0.0.1    www.publicity-ads-site.com

If you want to block a domain, just add a new line with 127.0.0.1 (the localhost address) followed by the domain; this way, when that domain comes up in the browser, the browser will look for it on your own computer instead of online, because the hosts file told it to. So, for example:

127.0.0.1    localhost
127.0.0.1    www.bad-spyware-site.com
127.0.0.1    www.site-with-virus.com
127.0.0.1    www.publicity-ads-site.com
127.0.0.1    google.com

So now if I put google.com in the address bar, the browser won't be able to load it and google.com won't work anymore (if the change doesn't take effect right away, flush the DNS cache with ipconfig /flushdns or restart the browser). If you want to delete an entry, just delete the line or put a # in front of it:

127.0.0.1    localhost
127.0.0.1    www.bad-spyware-site.com
127.0.0.1    www.site-with-virus.com
127.0.0.1    www.publicity-ads-site.com
#127.0.0.1    google.com (google.com will work now)

so the idea is to use the hosts file to block unwanted or bad sites ^-^ clean and easy hehehe
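One variation you will see in published block lists is 0.0.0.0 in place of 127.0.0.1; since nothing answers on that address, blocked requests tend to fail immediately instead of knocking on your own machine first, and the blocking effect is the same:

```
0.0.0.0    www.bad-spyware-site.com
0.0.0.0    www.publicity-ads-site.com
```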

Here are some sites that provide awesome host files ^_^ .oO (choose one of them)

Hostman : An automated hosts file updating tool
Host File : A pretty cool and clean hosts file
Someone Who Cares : A comprehensive hosts file
MVPS : A hosts file geared towards blocking unwanted stuff

Updates and Transfers

Hummm, I've made serious improvements on Besigh; it should work like a charm now ^_^ Also a major clean-up of Hentaish is on its way, as well as 4 new sites hehehe

Social Network Software Elgg Review


Yep, I've used the Elgg social network software in 2 experiments, one live, another offline, across versions 1.1, 1.2 and the new 1.5. Both times the software was lacking for my own needs as well as my users' (I'll explain the good things and bad things below). Most of all, it seems to be software directed at a specific view of what social network software should be, rather than an all-round platform... more often than not it mimics, in some way, the features found in Facebook and/or MySpace, instead of giving what I think small to mid-size social networks need. Also, I don't think most of the things Elgg lacks need to be core features; rather, they are plugins that should have been developed by the creators and not by users. I totally agree with their notion of a platform core + plugins for functions; it was one of the things that most impressed me about Elgg.

This is a combined list from 1.1 to 1.5 (I know some things were improved and are now much better in 1.5, but in my view they're still not ideal):

Awesome Features
  • True Plug and Play System (the core can always be updated, because both themes and plugins are totally unrelated to the core), similar to WordPress, and in my view the only way to create software that is meant to be augmented.
  • Basic Essential Social Network Features (user profiles, friends, activity...).
  • Security and protection of users' privacy in mind.
  • Even with 1000+ users, it doesn't seem to be much of a burden on the system and flows pretty well (I've seen SMF or phpBB start crawling under the same load).

Not so Awesome Features
  • Serious lack of communication from the developers (reason enough for me to stop using the software on any live site).
  • No central forum system / advanced message board.
  • No picture/video gallery system (the file system is just a very basic form of picture/video gallery).
  • Authentication and OpenID still don't work perfectly; several users constantly report problems and logouts, on a system where you should log in once and never again.
  • Several features are a little barebones (like blogging, files, pages, message boards).

The result of using Elgg, even 1.5, is very little user engagement. Even with high traffic flowing to the site, most users would register, leave, and never come back, mostly because they don't have a central place (forum, video or picture galleries...) to find friends on the site (unlike Facebook, where you bring friends from the outside). Meanwhile, the same site, now running SMF with very few extra functions (embedding and picture posting enabled), has in a very short time become a very active main forum with 4 times more page views and several times more engagement and time on site.

Also, their upcoming Elgg hosted solution doesn't give me much confidence, more because of their communication with the community than the platform itself. We will see; if they do improve, I'm sure I'll change my mind. Still an awesome platform, poor functions ^_^

Quick Improvement and New Site

Hummm, I've just moved panchira.org to a new server, in preparation for some changes (social features and video). Also I've made several improvements on yime.org (mostly around OpenSearch) and launched the besigh.com torrent engine (kinda mostly did this one for me... I'm sick of all the other hentai search engines that are always down or returning weird torrents; this way I have full control of the content ^_^). Also changed the look of this blog for a cleaner, more readable look hehehe

Spreadout Updates

Haiii haii, been updating loads of sites, and I have like... 3 or 4 new ones coming up. But for now... just re-did hostcult (yep, our hosting domain). Since we moved the hosting business to just hosting S2R projects, I hadn't had time to re-do the domain and make something useful; well, now I did: just a simple blog about hosting, both S2R and the industry hehehe. Ohhh, also I did a cool site called Smutish (it's kinda an adult aggregator of sorts), so check them out YAYYY ^_^

Welcome to the New Hostcult

Heheheh, wait? Isn't hostcult a hosting company? ... humm, it was ^_^ But about a year ago we refocused Send to Receive into a kind of web development company and decided to turn Hostcult into S2R's hosting provider only. We pointed our customers to other cool hosting companies, and some of them we still host (but this time for free >_< ). Since we have the domain anyway (mostly used for DNS and domain aliases), we might as well use it hehehe. So what does this new hostcult bring?

Well, mostly talk about hosting: news about the hosting industry, reviews of hosting (that we use), stuff like that. Also hosting-related updates about s2r (so if any s2r site is down or there is any problem with our servers, we will post updates here, since this blog is hosted on Google's Blogger)...

Think that's about it for now (still fixing up the hostcult theme); more stuff soon enough hehehe

Land of Upgrades

Yep, major improvements on hentaish (I went through everything: CSS, HTML, code, posts, graphics), preparing for the reboot ^_^ Also a lot of tweaking on neechan and tentakle, and last but not least some upgrades on panchira... mostly security and standard upgrades yayyy

Quick and Dirty Upgrades

Yep yep, been busy offline, but also online, doing major improvements on yime.org (more sites added; the search is much quicker and nicer), also improving neechan.org search, which should be way better now (it covers more than 300 other image boards and such...) heheheh. Integrated Google Friend Connect on ecchi.info and kudasai.org; especially on kudasai it made a vast improvement in speed compared with using Meebo rooms ^_^

Neechan Woes and Fixing Spam!


Well, we run neechan and tentakle (both anonymous image boards), and since about a month ago we started getting bombed with CP. Although completely against our rules, the idea of an image board is that anyone can post easily, so CP and other nasty stuff tends to show up from time to time; but this was not the usual case, this was continuous, multiple posts a day. The idea is simple: to bypass our various protections, they just post a CP image with the URL to visit written on the image itself, so things like banning the domain aren't going to work, and neither is banning image posting altogether, because that's the natural thing to do on an image board. So what did we do?

Hummm, we started by increasing the restrictions, adding a captcha and banning by image hash; however, the bots kept bypassing all of these. Old captcha systems can be broken pretty easily by software, and images can be changed ever so slightly to bypass an image hash. It was especially bad with neechan's Wakaba captcha, which was easily broken. This got to the point that, rightly so, someone told Google we were hosting CP ourselves (though only between the time of posting and the time it takes for an admin, i.e. me, to delete it). So clearly we needed to improve our code even more, and we hacked the code and started using reCAPTCHA.

So did it work? Actually, yes; not only because reCAPTCHA is quite a strong captcha system, but mostly because these bots are targeted at this specific software and its weaknesses. If I had changed to a simpler setup, I believe it would also have broken the bots, at least until they were programmed to break that particular system; but this way we can outsource at least part of our security to someone who knows captchas, thus solving our problem. So will we switch tentakle to reCAPTCHA as well? Hummm, if CP starts passing through, absolutely ^_^.oO ( thank you reCAPTCHA )

New Sites and Stuff

Just to say that panchira is back up and doing pretty well. Also I'm working on reworking (funny) a couple of other sites, as well as doing major updates on a couple more... lots of new stuff this week hehehe ^^

S2R is Over with Feedburner

This was the last straw. The only reason for using FeedBurner on more than 20 of our sites was that it provided good, reliable storage of the feeds, the ability to manage them and possibly monetize them, and above all the feed stats. But in the week since we moved the feeds from FeedBurner to Google FeedBurner, the already bad service has gotten even worse. It's a case of Google once again doing a shit-poor job with a service they bought, and in our case it's the final straw: we are dropping the service altogether. We prefer the clean, workable, always-available feeds our sites provide to the broken-down mess that is Google's FeedBurner. We are done, and we recommend most users drop them as well... really, no point; you lose more than you win...

Happy Christmas and Merry New Year

Aiiiiiii, an Awesome Christmas and New Year to every random person on the Interweb hehehe ^_^