This post describes my syndication funnels. I use the social bookmarking software Plikli (formerly Pligg, then Kliqqi) to create many inbound links to my sites across the blogosphere: replicated blogs on WordPress, Blogger, and WordPress.com. The site I'm describing covers B2B news and internet marketing. Please feel free to create an account there, and if you message me I'll make sure your links are SEO-follow and that you get a feature on the front page. I can also add your RSS feeds: this is a new site, and quality outbound links to authority sites are an essential part of building SEO rank.
The blog replication is done by editing each site and then syndicating the posts through social network sharing. I host regular blogs for free; enter your email for more information.
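Pulling a blog's posts into the hub comes down to parsing its RSS feed. As a rough sketch using only the standard library (the feed below is a made-up example; real feeds vary in structure):

```python
import xml.etree.ElementTree as ET

# Minimal sketch: extract (title, link) pairs from an RSS 2.0 feed.
# The sample feed is a hypothetical stand-in for a real blog's feed.
def feed_items(rss_xml: str):
    root = ET.fromstring(rss_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

sample = """<rss version="2.0"><channel>
  <title>Example Blog</title>
  <item><title>First post</title><link>https://blog.example/first</link></item>
  <item><title>Second post</title><link>https://blog.example/second</link></item>
</channel></rss>"""
```

Calling `feed_items(sample)` returns the posts in document order, which is all a hub needs to list the latest entries.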
The hub site then gets spidered regularly, and any updates on the blogs are visited almost immediately by the crawlers. That doesn't happen normally: usually it's a waiting game of days or weeks before the search engines notice that your site has been updated.
With Plikli, your site is added to the "real-time" indices of the search engine, which means any link you post will be visited within minutes, or at the very least the same day, depending on which version of the index picks it up.
It's not always necessary to post frequently, but search engines like a regular cadence, so if you post once a week, stick to that. Once an hour is fine too!
Make sure to update your front page. It's well known that when a spider notices an inbound link to your site, it flags your site for updates. But when the crawler visits, it ignores most of the URL and only follows the domain part. So if your front page isn't updated when you add content, you can miss your chance to have the new content crawled.
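One way to sanity-check this is to confirm that every new post is actually linked from the front-page HTML. A minimal sketch using only the standard library (the HTML and URLs here are hypothetical examples, not real pages):

```python
from html.parser import HTMLParser

# Collect every href from the anchor tags on a page.
class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [value for name, value in attrs if name == "href"]

def front_page_links_to(front_page_html: str, post_url: str) -> bool:
    """True if the front page contains a link to the given post URL."""
    collector = LinkCollector()
    collector.feed(front_page_html)
    return post_url in collector.links

front_page = '<html><body><a href="/posts/new-article">New article</a></body></html>'
```

In practice you would fetch the live front page and run this after each publish, alerting yourself when a post is missing.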
One solution that partially works to ensure your latest content gets properly indexed is to create a subdomain (you'll need a wildcard SSL certificate) and submit that to be bookmarked. The essential thing is to make sure your subdomain returns a 301 (permanent redirect) HTTP status code: this tells the search engine to transfer the ranking of the inbound link to the new target, so your subdomain's ranking can be independently applied to any deep link on your site!
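For reference, 301 and 308 are the permanent redirect codes, which are generally understood to pass ranking signals to the target; 302, 303, and 307 are temporary and are not. A tiny helper to keep them straight (my own sketch, not part of Plikli):

```python
# Permanent redirects are generally treated as transferring ranking signals
# to the target; temporary redirects are not. The SEO treatment is folklore,
# but the status-code semantics below follow the HTTP specification.
PERMANENT = {301, 308}       # Moved Permanently, Permanent Redirect
TEMPORARY = {302, 303, 307}  # Found, See Other, Temporary Redirect

def redirect_kind(status: int) -> str:
    if status in PERMANENT:
        return "permanent"
    if status in TEMPORARY:
        return "temporary"
    return "not a redirect"
```

Check your subdomain's response with `curl -I` and make sure the status you get back classifies as "permanent" here.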
The problem is that the engines notice this and will demote your site to a subdomain network, giving you a single ranking across all your subdomains. You can tell whether you are applying this technique correctly by watching your server logs: you will see strange requests for robots.txt appended to the URLs of your posts, and you can create a dummy robots.txt at those paths to satisfy the search engines.
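Spotting those requests in an access log is easy to script. A rough sketch, assuming a common Apache/nginx combined log format (the sample lines and paths are invented for illustration):

```python
import re

# Match the request path in a combined-format access log line.
REQUEST = re.compile(r'"GET (\S+) HTTP/[\d.]+"')

def odd_robots_requests(log_lines):
    """Return request paths where robots.txt was appended to a post URL."""
    hits = []
    for line in log_lines:
        m = REQUEST.search(line)
        path = m.group(1) if m else ""
        # Keep robots.txt requests, but ignore the normal root-level one.
        if path.endswith("/robots.txt") and path != "/robots.txt":
            hits.append(path)
    return hits

logs = [
    '203.0.113.5 - - [01/Jan/2024] "GET /robots.txt HTTP/1.1" 200 512',
    '203.0.113.5 - - [01/Jan/2024] "GET /2024/01/my-post/robots.txt HTTP/1.1" 404 169',
]
```

Any paths this returns are the "strange requests" described above, and they tell you where to place the dummy robots.txt files.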
Another way for your sites to get demoted is if your blog shares an IP address with your social hub. The spider keeps a record of the previous IP address (not the domain) of the referring site and will disregard links as reciprocal if they point back to that IP. So you need three IP addresses on your server. Assign domains to them so that a site on any domain only links outward to domains on one of the other IPs, while a domain on the third IP carries inbound links to the first domain but not the second. Really, each domain should have its own IP so that reverse DNS can work; without reverse DNS, you won't be able to send email from the domain.
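The separation rule above can be checked mechanically: given each domain's IP, flag any link whose source and target resolve to the same address. A sketch with made-up domains and documentation-range IPs:

```python
# Flag links between domains that share an IP address; by the logic above,
# a spider would treat these as reciprocal and discount them.
# All domains and addresses below are hypothetical examples.
def same_ip_links(domain_ip, links):
    return [(src, dst) for src, dst in links
            if domain_ip.get(src) == domain_ip.get(dst)]

domain_ip = {
    "hub.example":  "198.51.100.1",
    "blog.example": "198.51.100.1",  # shares the hub's IP: links are suspect
    "feed.example": "198.51.100.2",
}
links = [("hub.example", "blog.example"),
         ("hub.example", "feed.example")]
```

Run this over your full link map before publishing; an empty result means no two linked sites share an address.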