This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
Hi Tom: There are some errors happening:

  ERROR - ucar.nc2.ncml.Aggregation - Cant make cache directory= /usr/share/tomcat6/content/thredds/cache/agg/HFRNet-AKNS-6km-hourly-RTV
that are fixed in the latest version, 4.2.5. Can you upgrade and see if it still happens?

John
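For context, the aggregation cache directory named in that error is set in threddsConfig.xml, and the Tomcat user needs write permission on it. A minimal sketch, assuming a TDS 4.x threddsConfig.xml (the directory shown is simply the default path from the error message; element names and defaults should be checked against the threddsConfig reference for your TDS version):

  <AggregationCache>
    <!-- directory where joinExisting coordinate caches are written; must be writable by the Tomcat user -->
    <dir>/usr/share/tomcat6/content/thredds/cache/agg/</dir>
    <!-- how often to scour the cache, and how old an entry must be before it is deleted -->
    <scour>24 hours</scour>
    <maxAge>90 days</maxAge>
  </AggregationCache>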
On 4/4/2011 10:47 AM, tom cook wrote:

Hi John, I was wondering, when I see this again, what files and logs should I gather before I empty the cache? Any other info you have would be appreciated. Thanks, Tom

On Wed, Mar 30, 2011 at 1:17 PM, tom cook <address@hidden> wrote:

John, thanks for the reply.

On Wed, Mar 30, 2011 at 11:12 AM, John Caron <address@hidden> wrote:

> Hi Tom:
> 1) what version TDS?

Version 4.2.1 - 20101201.2028

> 2) is this an aggregation? (you can send me the config catalog)

Yes, threddsConfig.xml and catalog.xml are attached; let me know if you want more (we have an enhancedCatalog.xml and wmsConfig.xml too).

> 3) are there errors in threddsServlet.log?

Yes, I've attached a text file with the errors from the logs.

> 4) when you "update the netcdf files multiple times over 5-6 hours", do you overwrite? extend?

I believe the files are always overwritten. I'll double check and get back to you if I'm wrong. Thanks, Tom

> It's hard when it's intermittent; we may have to figure out how to consistently reproduce it. john
>
> On 3/30/2011 11:21 AM, tom cook wrote:

Hi John, I have been tasked with maintaining the HF Radar national network THREDDS server here at SIO. Rich has helped us set things up and has been a valuable resource for learning this system. We have been having a frequent problem that Rich has been unable to help us with: occasional bad timestamps and occasional bad data served through THREDDS.

An example of bad data being served through THREDDS: we are working with ASA & USCG to get data into their SAROPS product. They have complained that many of the data samples have had data in the wrong geographical location (i.e., ocean currents over land). I have noticed this as well using the Godiva viewer, and when I compare the Godiva map with the source netcdf file, it is not plotting the data in the correct location. This makes me think it is a problem with the data being served by THREDDS. The problem goes away after some time, as we typically update the netcdf files multiple times over the 5-6 hours after they are created. I'm not sure if the updating of the netcdf files is the cause of the problem, but it is not something that is easy to reproduce.

The problem we have with weird timestamps happens occasionally, and in random locations throughout the time index. It is usually just one sample here or there, not successive samples, and it almost always goes away when I empty the cache.

I'm not sure if these issues are setup related, or issues we can script some workarounds for. But I wanted to check with you to see if you have any idea of what is going on.

Thanks for your time,
Tom Cook
Programmer/Analyst
Coastal Observation Research & Development Center
Scripps Institution of Oceanography
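For reference, the kind of joinExisting aggregation discussed above is typically defined with NcML inside a THREDDS catalog dataset. Below is a minimal sketch, assuming an NcML 2.2 joinExisting aggregation over a time dimension; the scan location and recheck interval are illustrative, not taken from Tom's catalog. When the underlying files are overwritten in place, the recheckEvery attribute controls how often the aggregation checks whether its cached coordinate information has gone stale:

  <netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2">
    <!-- on a request, if more than 15 minutes have passed since the last check,
         re-scan the collection for new, deleted, or changed files -->
    <aggregation dimName="time" type="joinExisting" recheckEvery="15 min">
      <!-- pick up every .nc file under the (illustrative) HFRNet data directory -->
      <scan location="/data/hfrnet/AKNS/6km/hourly/" suffix=".nc" />
    </aggregation>
  </netcdf>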