Again, this is a newbie-type perspective, but it appears that they have too much control over this data. I was reading their "Terms of Use" and it is incredibly restrictive. It takes direct aim at sites such as azgeocaching.com. Has the community thought about building an open database of cache information, instead of submitting it all to a closed source? This really seems to be a community-driven activity that is controlled by a single entity. That might work out now, but there is no guarantee they will put the interests of the community ahead of their own. But like I said, I am still pretty new to all of this, so maybe I just don't understand the whole situation.

-Art-

> Art wrote:
>> I am new to caching, but a long-time Perl coder. Have you asked
>> geocaching.com about the possibility of a nightly XML dump, or a
>> .NET/SOAP interface?
>
> Yes... over and over again. They either say no, or promise the
> possibility of it in the distant future.
> Until then, we'll have to continue to pull a couple hundred megs of web
> pages off their site to get about 10 megs of data. It would really
> be worth their time, and save them some money, to just give us a data
> export.
> I think at this point it's just about control of the data... They
> could continue to have that with a license agreement, but oh well.
>
> Brian Cluff
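
For what it's worth, the export Brian is asking for wouldn't take much to consume on our end. Here's a rough Perl sketch of what pulling a nightly XML dump could look like; the URL and the <cache> record layout are made up purely for illustration, since no such export actually exists:

#!/usr/bin/perl
# Hypothetical consumer of a nightly XML cache dump.
# The export URL and the <cache> element layout are assumptions
# for illustration only; geocaching.com offers no such feed.
use strict;
use warnings;
use LWP::Simple qw(get);
use XML::Simple qw(XMLin);

my $url = 'http://www.example.com/exports/caches-nightly.xml';   # assumed URL
my $xml = get($url) or die "Couldn't fetch the nightly dump\n";

# Treat <cache> as a list even when only one record is present.
my $data = XMLin($xml, ForceArray => ['cache'], KeyAttr => []);

for my $cache (@{ $data->{cache} }) {
    printf "%s  %s  (%s, %s)\n",
        $cache->{waypoint}, $cache->{name},
        $cache->{lat}, $cache->{lon};
}

One fetch a night like that would replace the couple hundred megs of scraped HTML with a single file of roughly the 10 megs of actual data, which is exactly Brian's point about it saving them bandwidth and money.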