Here's an idea I came up with. You know the way you surf the net and forget to bookmark a site, or you've seen something and you can't remember where it was? Well, how about surfing through a proxy, namely Squid? You use Squid to find all the sites you've visited, using something like "squid sites" I think it's called. Then you put the URLs of those sites into a database and crawl them with a search program, so you can find the stuff you want. It would be handy!

You could also cache the pages every time it goes and searches, so when you visit a page it loads quicker because it's coming from the local cache.

You could have friends with this too. That means if they've seen something you want to know about, you can search their servers as well. IPv6 could be handy for this. It would be like each machine having its own search engine: the owners of the systems would be able to search their own networks and also the users' pages.

I want to try to implement something like this. I'm going to use another search program and Squid as the proxy, then use squid sites as the app to see what pages you've visited. Every few hours it would find out the pages you've visited and then, given you have a broadband connection, the search robot will go through the new sites. It will search through sites every night. More to come.
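As a rough sketch of the first step, here's how the "find the pages you visited" part might look in Python: pulling distinct URLs out of Squid's access.log (in its default native format, where the request URL is the seventh field) so they can be handed to the crawler. The log path and the sample lines are just assumptions for illustration.

```python
def visited_urls(log_lines):
    """Extract unique request URLs from Squid native-format log lines.

    Native format fields: timestamp, elapsed, client, code/status,
    bytes, method, URL, user, hierarchy/peer, content-type.
    """
    urls = set()
    for line in log_lines:
        fields = line.split()
        # Field index 6 is the request URL; skip malformed lines
        # and non-HTTP entries (e.g. CONNECT tunnels).
        if len(fields) >= 7 and fields[6].startswith("http"):
            urls.add(fields[6])
    return sorted(urls)

# Example with made-up log lines; in practice you'd read
# something like /var/log/squid/access.log instead.
sample = [
    "1047513600.123    250 192.168.0.2 TCP_MISS/200 4512 GET "
    "http://example.com/page.html - DIRECT/93.184.216.34 text/html",
    "1047513601.456     80 192.168.0.2 TCP_HIT/200 1024 GET "
    "http://example.org/ - NONE/- text/html",
]
print(visited_urls(sample))
```

From there, the URL list would go into the database for the nightly crawl.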