It’s great that there are so many services out there for Lifestreaming, but I also envision a future where I can query and view my data using local tools in a more flexible manner outside the cloud.
I have written about various methods that can be used for archiving or caching our Lifestream data. Most of these methods rely on self-hosted blogging or CMS platforms whose plugins take the imported stream data and create new records in a local database for archiving.
Konstantine Thoukydidis has come up with a method to run his Lifestream feeds through Google Reader and leverage its historical storage to provide an archive of his self-hosted Lifestream without using a local database. He describes the method in detail, which includes using SimplePie and some custom code. I don’t see any mention of time-stamping issues with the imported feeds; I would imagine there would be some slight issues there. Either way, I have to say that this is a great, creative way to get this done.
You can read it all Here.
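For anyone curious what the database-free approach might look like in practice, here is a minimal sketch in the spirit of his guide: SimplePie pulls items straight from a Google Reader public feed and prints them with their original timestamps. This is my own illustration, not his exact code; the user id, label, and item-count parameter in the feed URL are placeholders you would swap for your own shared folder or tag.

```php
<?php
// Minimal sketch: render a lifestream straight from Google Reader's
// public Atom feed using SimplePie, with no local database.
// The user id and label in the URL are placeholders for your own
// shared folder/tag; n= asks for more historical items than the default.
require_once 'simplepie.inc';

$feed = new SimplePie();
$feed->set_feed_url(
    'http://www.google.com/reader/public/atom/user/12345678901234567890/label/lifestream?n=100'
);
$feed->enable_cache(true);            // cache locally so the feed isn't hit on every page load
$feed->set_cache_location('./cache');
$feed->init();
$feed->handle_content_type();

// Google Reader preserves each item's original timestamp, so the
// entries print in their true chronological order.
foreach ($feed->get_items() as $item) {
    echo '<p>';
    echo $item->get_date('Y-m-d H:i') . ' - ';
    echo '<a href="' . $item->get_permalink() . '">' . $item->get_title() . '</a>';
    echo '</p>';
}
?>
```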
It would be cool to see the guide extended with a method to export the data as well. I haven’t tried the “offline” options for Google Reader, but that might be another way to generate backup archives.
Hah! I don’t believe you linked to my professional page (which is out of date atm anyway) 🙂
As for the time-stamping issues, there are none that I’ve noticed, since Google Reader keeps the original timestamps and provides them. You can see in my lifestream that the actions are placed in the correct order.
Another bonus that I did not mention in my article is that Google Reader will keep serving your feed even if, for some reason, your original service provider goes down (now or in the future), as happened with my comment logger (getboo).