Interesting. Stephen Wolfram (yes, the Mathematica and Wolfram|Alpha guy) thinks computers should have their own websites.
The idea is that instead of a person going to another person’s website for data, a computer could go to another computer for data.
We already do something like this and call it a mash-up (a site, set up by a person, that uses a computer to combine data from other sites) or a scrape (a computer, directed by a person, gathering data from other websites). I'm currently scraping NHL data for a project with Dave Berri.
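To make "scrape" concrete, here's a minimal sketch of the idea: pull structured data out of another site's HTML. The page is inlined as a string here (a real scrape would fetch it over HTTP first), and the player names and goal totals are made up for illustration — this isn't real NHL data or my actual project code.

```python
from html.parser import HTMLParser

# Stand-in for a fetched page; the table and its values are invented.
PAGE = """
<table>
  <tr><th>Player</th><th>Goals</th></tr>
  <tr><td>Player A</td><td>42</td></tr>
  <tr><td>Player B</td><td>37</td></tr>
</table>
"""

class StatsScraper(HTMLParser):
    """Collect the text of every table cell, grouped by row."""
    def __init__(self):
        super().__init__()
        self.rows = []        # finished rows
        self._row = None      # cells of the row being parsed
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
            self._row = None
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and data.strip():
            self._row.append(data.strip())

scraper = StatsScraper()
scraper.feed(PAGE)
header, *rows = scraper.rows
goals = {name: int(g) for name, g in rows}
print(goals)  # {'Player A': 42, 'Player B': 37}
```

The fragility is the point: the scraper only works because a human looked at the page and hard-coded its structure, which is exactly the duplication Wolfram's computer-facing sites would remove.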
This would certainly solve a lot of duplication and accuracy problems.
Via Marginal Revolution.