
2013 Site Updates

Some links and images are no longer available in our older site updates. Additionally, some information is no longer applicable.

For the latest site updates, click here.

November 24th, 2013

Over the past month I have continued adding historical recon data to the developmental recon system. That step is now complete, including missed data and non-tropical RECCO messages. Google Earth mapping will remain unavailable until I have rewritten more of the decoders and can then create the Google Earth map files. Most obs cannot be automatically decoded in the archive, or even decoded in the manual recon decoder. That is on purpose: I would rather rewrite each decoder first and only release a decoder into the new system when it is ready to be tested. Google Maps recon mapping and the live recon system will appear in mid-2014 when the transition to the new recon system is fully made.

I have also been working on some systems shared between the model and recon systems, and I will be continuing work on the model system as well. I am rewriting a lot of it, but I am likely to hold off on rewriting some parts until later in 2014 so that I can get the new version up and running sooner. I still need to redo the Google Maps mapping for the ATCF fix feature in the model system.

October 17th, 2013

A massive amount of work is underway on rewriting both the model and recon systems for 2014. The systems have so much in common in how they are written that both are being rewritten at the same time. I hope to have both using the new code base by mid-2014 for beta testing and public release of an early version of the new code. It may be 2015 before the rewrite is finalized and additional features are added.

In order for the code of the system to eventually be released publicly, it needs to make sense. Simply working is not enough. Most of the web based interface of the model system will be rewritten. In time, the code that processes the raw data will also be rewritten.

The recon system is a considerably more complex rewrite. It is more likely to have parts released on the experimental recon site, since the manual decoder lets people decode their own observations to test the new system. The new live recon system will likely not be online until the middle of next year, as nearly everything else needs to be rewritten first, although some older decoders may not be rewritten until late 2014 or 2015.

The model system cannot be easily modified in pieces. There will not be any new features, other than the previously mentioned consensus line feature on our model backup site, when the rewritten system is released. The new code base, however, will make future improvements easier to implement. Any new features will likely not be available until 2015.

Both new systems are completely different from the operational versions currently on our site. As a result, aside from bug fixes, there will not be many updates to the operational model and recon systems on our site.

While I may not post updates often, there is a lot of work going on. Since things will be slow to be released, it is simply not worth spending a lot of time writing about features that may not appear until next year. The only recent feature released publicly is a new sonde diagram on the experimental recon site. It will undergo further revisions, likely adding some additional display options. That diagram is one of the things that will eventually be shared with the model system. Its new dynamic nature will eventually be adapted for the AXBT depth chart in the recon system and then for the wind, pressure and model error diagrams in the model system. The same technology may be used for future features in the new model system in 2015 as well.

For many weeks now I have been adding historical data to the recon system I am testing offline, filling in gaps the current live system has missed over the years. I have been adding non-tropical RECCO messages and non-tasked winter missions for prior years. All of this data has yet to be added online. I am going over tens of thousands of observations to check the completeness of the recon archive. This is a very, very long process. It helps determine which observations can be correctly decoded and which cannot, which is an important step before the recon decoders are rewritten further to make the code easier to understand and work better.

July 27th, 2013

Our official model backup site now has our new consensus line feature. Our site will likely not carry the new feature until later in August since all model data will have to be recreated. There have been a lot of other changes to the models in the Google Map. Next I would like to rewrite the Google Map for the ATCF center fix system. First I might need to update the recon decoder for another AXBT format. After those two things, I will get back to rewriting the entire recon system.

July 23rd, 2013

While the consensus feature is not yet available, our official model backup site is now running the updated Google Model map as of 0Z on July 23rd. Rather than an XML file, the script on that site now generates a .json file. Since that site does not have archived data it is easier to release the update there first. The consensus will also appear on that site first, though it may be a week or two before that occurs. Then the ATCF fix system will eventually be updated on that site. Once all of that is complete, the update will be applied on our site and all old model data will be reprocessed.

July 22nd, 2013

Work continues on updates to the operational model system. It may still be a month before the update is released and all old model data is recreated. This morning, after several weeks of work, I was able to test part of the update: I generated model data for a storm in the .json file format and loaded it into the Google Map. While there is still a lot of work to do, now that part of the major new feature worked this morning, I'll go ahead and reveal what it will be. (I was not sure how this was going to work, and it ended up being a lot more complex than I originally thought.)

The update will allow a user to create a consensus of the models they select. If you select 10 models, they will be averaged to create a consensus line. There will be a lot of options regarding the line and I am still thinking up more. I have yet to implement any of the options. This morning was the first time I was able to simply see it work in the map. (It's always nutty to work on something like that for weeks and not see it in action during that time.) Another small update will display the actual forecast date of each model position when you click a point. For example, instead of just seeing something like 120 hours from the early cycle initialization in the popup window, you will see the actual date and time when the center is forecast to be at that forecast point. Late cycle models will have the accurate forecast date too.

The Google Model map has basically been rewritten to work better, although visually there will be little change. I also need to update the map in the ATCF center fix system, which I have not yet started. That requires redoing all the model data as well, so I don't want to recreate all the model data, which takes a long time, until I finish that.
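For anyone curious how a consensus line like that can be built, here is a minimal sketch of the idea: average the selected models' positions at each forecast hour. The data layout and function name are purely illustrative, not the site's actual code.

    // Illustrative only: average selected model positions at each forecast hour.
    // Assumes each model looks like:
    //   { name: "GFSI", track: { "0": { lat: 25.1, lon: -70.2 }, "12": { ... }, ... } }
    function buildConsensus(selectedModels) {
      var sums = {}; // forecast hour -> running lat/lon sums and a count
      selectedModels.forEach(function (model) {
        for (var hour in model.track) {
          if (!sums[hour]) {
            sums[hour] = { lat: 0, lon: 0, count: 0 };
          }
          sums[hour].lat += model.track[hour].lat;
          sums[hour].lon += model.track[hour].lon;
          sums[hour].count += 1;
        }
      });
      var consensus = {}; // forecast hour -> averaged position
      for (var hour in sums) {
        consensus[hour] = {
          lat: sums[hour].lat / sums[hour].count,
          lon: sums[hour].lon / sums[hour].count
        };
      }
      return consensus;
    }

Only the models that actually have a position at a given hour would count toward the average at that hour, and longitudes near the dateline would need extra care.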

It could be the end of August before the update is applied and I get back to work on the new recon system. I have learned a lot rewriting the model map. That knowledge is going to be a great help when I do the live Google Map for the recon system.

July 12th, 2013

Afternoon Update: We had some issues today with the update implemented this morning in the model system. It is now working again.

Morning Update: Over the next several weeks, perhaps longer, I am going to upgrade some of the code in the model system. A lot of the rewrites so far have involved the model data in Google Maps, and that code is being significantly rewritten. I am changing how that part works by altering how model data is stored for use in the Google Map. I will also be updating the ATCF fix system so that, rather than creating static HTML pages, the content is loaded from a CGI script. That will allow me to update that map more easily. All of that will mean recreating all model data on the site.

The CGI scripting in the model system could use a lot of work too. Some or most of that might come later. I'm not sure yet. Working with the Google Map in the model system has helped a lot with how I plan for the recon system to work and I would like to get back to that soon.

One reason I want to improve the CGI scripting in the model system is problems like last night's. The model system was not prepared for 03L to become 96L. It was in part, but because of an earlier error where a file was renamed incorrectly, I had coded the system to not allow that case. Now it is coded to allow it again. This is what makes some of the code sloppy. I really need to go over all of it line by line, make it more legible, and annotate what is going on. While the code is a lot better than it used to be, it still needs another rewrite. Unfortunately this will delay work on the new recon system further into next year.

July 4th, 2013

I have continued to make code improvements to the model display in Google Maps, and I am still working on it. The model system and the new recon system I am working on share some basic code, so bringing both systems up to the same standard will help moving forward with the recon system. The new JavaScript is written much better and can therefore be linted, minified and optimized more effectively. Plus, I am actually commenting in the code about what is going on, which will help other developers who use the code when it is released under the GPLv3 license in 2014. If you have any suggestions for the model display, or even the recon display, let me know.

The code improvements also extend to the server side, in the Perl scripts that run the system. Some changes have been made to that code, and many more will be made over time, to improve how it is written. There will be no feature changes at this time, just updates so the code reads better and in some cases works better.

I have added a lot of new model names that were added/updated in May. I had already updated some, but there were still a lot of changes.

July 2nd, 2013

I continue to work some on the Google Maps display for model data in the model system. Some of what I have done in the new recon system I am building offline has been applied to the model system, including the new NASA layers, which make a good background for model data. You can also now refresh NOAA satellite imagery and choose an opacity for it. Try adding model data with a NASA layer and a slightly transparent NOAA satellite image. I am also improving the JavaScript code, which gives me better practice for building the new recon system too.
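As a rough illustration of how a semi-transparent overlay like that works in the Google Maps v3 API (the tile URL and layer name below are placeholders, not the actual NASA or NOAA sources used here):

    // Rough sketch: a semi-transparent tiled overlay on a Google Map (v3 API).
    // The tile URL is a placeholder, not the site's actual imagery source.
    var imageryOverlay = new google.maps.ImageMapType({
      getTileUrl: function (coord, zoom) {
        return "https://example.com/tiles/" + zoom + "/" + coord.x + "/" + coord.y + ".png";
      },
      tileSize: new google.maps.Size(256, 256),
      opacity: 0.5, // slightly transparent so the model data underneath stays visible
      name: "Imagery"
    });
    map.overlayMapTypes.push(imageryOverlay); // "map" is an existing google.maps.Map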

June 29th, 2013

I have decided to improve some JavaScript code in the model system on the page that shows model data in Google Maps. It might take some time. After working some on the recon system, I saw where the model system could use some updating. Other than one new feature I just added, there will not be any new features, just code updates. You can now add NASA Blue Marble (2004) or Earth at Night (2012) imagery as a background. Look at the bottom of the options panel for this new feature.

June 27th, 2013

I have rewritten the customizable satellite page. At first I just wanted to improve the JavaScript code and make the page compliant with HTML 5, which no longer includes the frame tag. Browsers will still support frames, but I wanted to go ahead and rewrite the page with an iframe instead. However, I also realized that there are a lot of color enhancements available through the NASA page, so I added those too. If you notice any errors, let me know.

I also updated the model system to remove a piece of code I did not know how to use properly. It prevented pages with maps that were not full screen from working well on mobile devices.

As for the recon system I am working on offline, I found a few interesting things to add to it. You will be able to select NASA Blue Marble imagery from 2004, with the month you want, or Earth at Night imagery from 2012. These are great overlays that I will eventually add to the model system as well. The only other thing I worked on was making a button that can hide all buttons and panels. Then to show them all again you have to click the top right corner. I will continue work on the recon system offline now that I completed the updated satellite page.

June 21st, 2013

All Atlantic best track data from 1941 through 1945 has been recreated to reflect reanalysis changes. Eight other storms were also updated in the Atlantic to correct minor typos. You can read more about the reanalysis here or here.

Work continues on the new recon system. It is slow at the moment, more of a thinking phase. I have to think about how the Google Maps recon display will work, and there is much more to think about than I thought. It is not that it is complicated to implement, just complex to fit everything into a single display that is not exceptionally cluttered. If you are viewing archived data and want to switch to live data, how should that work? What if you are viewing live data and want to add archived data for comparison? There is setting the interval of how often to check for data, possibly for each individual live mission you are viewing in the display, and displaying when data was last updated, for each live mission. The Google Maps live recon display will most likely allow multiple live missions to be loaded. Whether you will be able to add archived data to that as well, or possibly load multiple archived missions, is still questionable. (Perhaps, but you could always view it in Google Earth.) There would probably be a limit on how much archived data you can add. (If left unlimited, you could easily crash your browser.) There is just so much to consider, so right now I am working out how it will work.

I like the idea of adding previous vortex messages for a storm as well to the Google Maps recon display. I will probably add that. In the new system, though not yet coded, I also plan to have a file that has just vortex messages for all missions, all updated in real time as a new or updated vortex comes out. I like doing this manually myself, so I figure why not make it easy on myself to do it automatically. And, maybe someone else would like the ability as well.

Another change to the recon archive will be the ability to view mission data by mission, rather than only by ob type. Now that the file structure will be different in the new system, this is easier to do. Then you can view all the mission data for each product type on one page. (I don't know why I didn't think to do it previously.) I have not constructed it yet; I want to work more on the Google Maps parts first.

June 17th, 2013

The Automated Tropical Cyclone Forecasting System (ATCF) has not released best track data in the "tcweb" folder for 93L. It has instead only appeared in the best track folder ("btk") and model folder ("aid_public"). As a result, our site, and others, have had some issues adjusting. I believe the data should continue to appear now after a series of corrections. If 93L becomes a depression, some additional errors may occur if the "tcweb" folder still does not contain 93L. Our site is designed to attempt to handle such errors, but sometimes things go wrong.

Meanwhile, work on the recon system continues to go well.

June 15th, 2013 - Live recon in Google Maps will be possible

I have been able to perform a successful test in IE 7-10, Firefox and Chrome of the technique that will be used to power live recon in Google Maps. As previously discussed, a .json file will be created on the server containing recon data. That file will be zipped (compressed), so there will be two files on the server: the .json file and the compressed archive that contains it. Using JavaScript, the system will attempt to download the zip file, inflate it (uncompress it) using JSZip and then parse the .json file. If that fails, the system will fall back to basically the method the model system uses (except the model system uses an XML file) and download the original .json file and parse that. The system will do this every so often, perhaps every 5 minutes. The JavaScript will request the .json file with an appended value containing part of the date so that the file is always fresh and never served from cache when called automatically.
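To make the flow a little more concrete, here is a minimal sketch of that zip-first, plain-JSON-fallback approach. The file names are placeholders and the JSZip call assumes the library's constructor-loading API of that era, so treat it as an illustration rather than the system's actual code.

    // Illustrative only: try the compressed file, fall back to the plain .json.
    var useZip = true; // once inflating fails, stick with the plain .json for this session

    function fetchReconData(onData) {
      var bust = "?v=" + new Date().getTime(); // cache-busting value so the file is always fresh
      if (useZip) {
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "recon.json.zip" + bust, true);
        xhr.responseType = "arraybuffer"; // older IE would need a different binary-download path
        xhr.onload = function () {
          try {
            var zip = new JSZip(xhr.response);                   // inflate in the browser
            onData(JSON.parse(zip.file("recon.json").asText())); // parse the contained .json
          } catch (e) {
            useZip = false;
            fetchReconData(onData);                              // fall back to the plain file
          }
        };
        xhr.onerror = function () { useZip = false; fetchReconData(onData); };
        xhr.send();
      } else {
        var plain = new XMLHttpRequest();
        plain.open("GET", "recon.json" + bust, true);
        plain.onload = function () { onData(JSON.parse(plain.responseText)); };
        plain.send();
      }
    }

    // Check for new data every so often, e.g. every 5 minutes.
    setInterval(function () {
      fetchReconData(function (data) {
        // redraw the Google Maps display from the parsed recon data here
      });
    }, 5 * 60 * 1000);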

Over the past week, since I found that using Google Maps would be possible, I have started designing how the Google Maps recon display will look. I have decided to go ahead and try to get the display to work with a fake .json file. That will likely take several months. Then I can add to the decoders the code the live system will need to generate the data for a real .json file. I have also not quite determined how Google Mapping will work for single observations that you decode in the manual decoder, especially for any observation other than HDOBs, which all have one set of coordinates that would be mapped.
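Purely as a hypothetical illustration (the real structure has not been settled), fake mission data for testing the display might look something like this, shown as a JavaScript object with placeholder values:

    // Hypothetical fake mission data for testing the display only;
    // the actual .json structure has not been decided.
    var fakeMission = {
      mission: "NOAA9 0105A EXAMPLE",
      updated: "2013-06-15 12:34Z",
      hdobs: [
        { time: "1230Z", lat: 24.5, lon: -83.1, windDir: 95, windSpd: 42, pressure: 1002.6 },
        { time: "1231Z", lat: 24.5, lon: -83.2, windDir: 97, windSpd: 45, pressure: 1001.9 }
      ],
      vortex: [], // decoded vortex messages would go here
      sondes: []  // decoded dropsondes would go here
    };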

As this next part will take a long time, I will likely not have anything to update in the developmental version for many months. Even when I have completed the Google Mapping part, it may not even appear in the manual decoder in the developmental version for quite some time.

June 7th, 2013 - About live recon in Google Maps

Over the past few days I have been doing a lot of planning regarding the new recon system. I realize that Google Maps for the manual decoder is going to be very much tied to the Google Maps for the live system. In a way that is obvious, but to make things easier in the future when updates need to be made, the coding of the two systems should be tied together as much as possible. With that in mind, I can't develop the Google Maps feature for the manual decoder without also doing the Google Maps for the live system. And, I can't release the new system without Google Maps mapping. That means I will not activate the live system in the new recon system until the entire system is complete. I thought about activating the new live system at the end of this year without the live Google Maps part but I have decided against that. Therefore, in a way the new system might be delayed some. In another way, doing it all at once might be easier. What I am getting at is that I have begun the development of the part of the new live system that will allow recon in Google Maps.

There has been a lot to think about, but I think I have settled on some of the main aspects of how it will work. I am now in the process of building the main parts of the JavaScript code that will interact with a demo of what the raw Google Maps file may look like for a mission. I will start there to see if it works. If it does, I can implement what is needed on the server side in Perl to generate the mapping file for Google Maps. This file will likely be a .json file that will be read into JavaScript. An XML file may or may not be smaller the way I write things, but performance-wise it might be better to have everything in JavaScript rather than pulling data out of XML tags and adding it to arrays. I'm most likely not going to do any benchmark tests, but if things run slowly I'll see how it would work the other way. XML and JSON are fairly easy to convert between.

The file created by the server will be loaded automatically so the latest recon information keeps displaying live, without needing to refresh the page. After a lot of thought, I think this is the best route, and I am going to try to reduce the file size of the content that needs to be loaded as much as possible. Ideally, the data would be served from a compressed .json file and inflated using only JavaScript via JSZip. If that is unsuccessful, the page would access the original uncompressed .json file initially and for the rest of that session, which would put more load on the server, but at least the compressed route would reduce the load for most people. There would be extra JavaScript from JSZip, but I figure it takes only so many loads of the recon data before downloading extra code to uncompress a compressed file makes more sense than constantly accessing an uncompressed one.

Required imagery would be stored in an external JavaScript file in text form, using base64 encoding. Imagery would therefore be loaded all at once and not again and again. This differs from the Google Earth file, where all imagery is zipped and sent to the end user each time; that is the best way to do it in that context, since the data can be saved to view offline and our server is not hit with all sorts of image requests, but it is not the best approach for Google Maps. All wind barbs and other needed imagery, like most product icons and all the different planes, will be stored inside a single JavaScript file and loaded only once, reducing the number of calls rather than having separate imagery or calling all the imagery again and again. No matter what is needed, all the imagery will be loaded. That works best because I have to plan for more wind barbs being needed or a new product icon, and if a mission switches I might have another agency and need another plane. Also, depending on plane orientation there are already two variations for each plane, though once I get into it I might be able to reduce that need by inverting a single image in JavaScript and then rotating it to convey which direction the plane was last traveling. (That might not be possible in IE 8 and earlier, since I don't think I could invert an image there, which means half the time the plane icon would point the wrong direction and I would need the second image anyway. I would need to research that more.) I don't think a sprite image could work well for all the images, especially when rotating so many wind barbs, and using base64 lets me reduce things a little further by taking out repetitious text in some of the base64 strings. It is also easier to work with the images in the JavaScript code.

For vortex messages I will likely store a blank round circle. Using CSS I will likely absolutely position text over it to represent the MSLP at the surface. I could perhaps even add an asterisk if the pressure was suspect, or make that an option for people to choose. (I have not tried that quite yet, but if I can draw the model name in Google Maps in our model product, I think I can do the same thing here.) Wind barbs would look just like they do in Google Earth. Using an aspect of HTML 5, the canvas element will allow the needed rotation. This works in modern browsers as well as Internet Explorer 9 and above. Windows XP users on Internet Explorer 8 or below would have to use another browser to view the wind barbs fully; in those older browsers the barbs will not be displayed and you will simply have ugly white circles. (Users on something more recent than Windows XP could upgrade their version of Internet Explorer.)

In regards to image size, Google Maps recon imagery will have a smaller file size, other than for wind barbs, which are already small, since in Google Earth the images need to be larger. When you click one in Google Earth it becomes larger; in Google Maps that doesn't happen, so the detail in each product image, and most likely the plane icons too, can be reduced, which reduces the file size.
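As a rough sketch of the canvas rotation idea (the base64 string is truncated and purely a placeholder; the real barbs would come from the single imagery file described above):

    // Rough sketch: rotate a base64-encoded wind barb with an HTML5 canvas (IE9+).
    var BARB_PLACEHOLDER = "data:image/png;base64,iVBORw0KGgo..."; // truncated placeholder, not a real barb

    function rotatedBarb(base64Src, directionDeg, callback) {
      var img = new Image();
      img.onload = function () {
        var canvas = document.createElement("canvas");
        canvas.width = img.width;
        canvas.height = img.height;
        var ctx = canvas.getContext("2d");
        ctx.translate(canvas.width / 2, canvas.height / 2); // rotate about the center
        ctx.rotate(directionDeg * Math.PI / 180);           // wind direction in degrees
        ctx.drawImage(img, -img.width / 2, -img.height / 2);
        callback(canvas.toDataURL("image/png"));            // result can be used as a marker icon URL
      };
      img.src = base64Src;
    }

In browsers without canvas support, the original unrotated image (the plain white circle mentioned above) would be shown instead.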

Basically, the Google Maps version will look like the Google Earth version, only with more features. I have not decided on all the features yet. Perhaps SFMR surface wind speed barbs, with the legend changing to reflect what you are viewing? Maybe. (Maybe you could even add something to the end of the URL to customize what you want to view.) Another idea: perhaps a custom legend where someone can pick their own color scheme for the wind barbs, which would automatically update the legend. You could customize the link to the Google Map so that your presets load. Another maybe. At the very least, you will be able to add and remove things like you can in our model maps in Google Maps, and it will be like our Google Earth version, only with more features, since JavaScript powers it and you can manipulate the mapping to what you want. For Google Earth, you don't have a whole lot of options. With Google Maps, you will have some, though again I don't know which until I think about it more as work continues.

To be clear, it would work like Google Earth, with clickable icons. Click a vortex, sonde, RECCO, HDOB or NOAA AXBT (once I hopefully determine a way to match that data up with a specific mission, which should not be extraordinarily difficult) and you get what you would in Google Earth: a popup with all the information, including charts if applicable (sonde and AXBT). The storm slice feature would not be in it, though I might make it so you could switch to it in some way if I can automatically determine passes through the center, or possibly let you create your own pass, all within the Google Map. That is a little ambitious though, like being able to click the start and end HDOB barbs that the slice should be generated for. However, now that I think about it, that could all be done in JavaScript, as I have all the data I need from the Google Maps file. That also saves bandwidth because I can load all the variables from the Google Maps file that I need to send to the chart I want to show. (I'm always thinking about how to do something.)

The likely plan is to have some HTML added directly to the Google Maps file for parts of some, if not all, messages, especially something like a vortex or RECCO. I don't want to build a JavaScript decoder, so that HTML would be stored in the Google Maps file as is. However, tables can be constructed in JavaScript. For example, an AXBT message can have a massive number of depth levels, and I need them all. In Google Earth, if I include every depth in a popup, I need the HTML for the table plus all the values again for what Flash uses to show the diagram. Lots of bandwidth. (Though the entire file is zipped from the original KML and renamed as .kmz.) For Google Maps I can handle the depth levels easily by simply listing the data in the file. Then JavaScript can distribute it to a table and also add it to the Flash diagram easily. The sonde diagram would work the same way, perhaps with a few extra things from elsewhere in the message if some data is doubtful. I would probably handle the mandatory levels along with the significant wind and temperature/humidity levels that way. Everything else for the sonde and the AXBT, for example, would be included as HTML in the Google Maps file, generated by the decoder on the server side in Perl. As for HDOBs, everything would be separate, with no HTML. I can distribute that info to popups, and possibly to a storm slice if I do that. The benefit of sending just the numbers rather than HTML is that I can also do the additional unit conversions in JavaScript rather than putting them in the Google Maps file. Maybe the end user doesn't want the unit, or maybe they even want another unit. That could be optional, whereas in Google Earth you can't do that.
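As a small illustration of that idea (the data layout and field names here are hypothetical, not the actual format), listing the raw depth levels and letting JavaScript build the table and do the unit conversion might look like:

    // Illustrative only: build a table from raw AXBT depth/temperature pairs
    // and convert units in the browser instead of baking HTML into the file.
    // "levels" is assumed to look like: [ { depth: 0, temp: 28.4 }, { depth: 10, temp: 28.1 }, ... ]
    function axbtTable(levels, useFahrenheit) {
      var html = "<table><tr><th>Depth (m)</th><th>Temp (" + (useFahrenheit ? "&deg;F" : "&deg;C") + ")</th></tr>";
      for (var i = 0; i < levels.length; i++) {
        var temp = useFahrenheit ? levels[i].temp * 9 / 5 + 32 : levels[i].temp;
        html += "<tr><td>" + levels[i].depth + "</td><td>" + temp.toFixed(1) + "</td></tr>";
      }
      return html + "</table>"; // ready for the popup; the same array would feed the diagram
    }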

Just sitting here typing this post, I thought of something based on a conversation I had with someone about how they like our Google Earth models because you can add overlays there but not in Google Maps. Maybe I might add the ability to position an image of your choosing on the Google Maps recon display, like entering the link to a floater and manually choosing the latitude and longitude bounds of the image. (It might be possible, though the image would have to be made transparent to position it more easily, which should be possible in most browsers.) It could go under some kind of advanced options. I will add NOAA imagery like I currently have in Google Maps, but that is not great for most close-up imagery.

So that is a summary of the current aspects of the new recon system I am working on. There are still many months of work beyond the above, so it will definitely be at least the end of this year, if not early 2014 (which is becoming more and more likely), before the new system is finally complete. Well, more like ready for live testing during the offseason. I still don't know if all of the above will even work.

I don't know now when the mapping for recon in Google Maps in the experimental manual decoder might be available for testing. The manual decoder will use a lot of the same JavaScript as the live system. However, it will not use data stored in the .json file (or the compressed zip file containing the .json file), but instead data in arrays that may very well mimic the .json file in some way so that I can reuse a lot of the same JavaScript code. (Just without any live components, though those may still be in the main JavaScript file, which will be minified. I'll have to see how big these files get.) Because the parts are likely to be very integrated, it might be a long time before I release the Google Maps part of the manual decoder for testing.

June 5th, 2013 - 2014 Developmental Version of New Recon System

It's always great to get feedback. With that in mind, today I have decided to go ahead and release the first tiny piece of the new recon system. It is highly experimental and meant only for testing and feedback. Let me know if you see errors or have a suggestion. (Note: links to info about various features are unavailable) The new recon system will still not be complete until likely early 2014. You can view it here. I am linking to the front page which has an error warning and a summary of what is and is not available. As a new feature is completed, and somewhat tested, I will probably add it. Among the first will probably be Google Mapping for HDOBs. I have now added part of the dropsonde decoder online so that it can decode NASA GlobalHawk dropsondes. It will likely be several weeks or more before anything else is added.

June 4th, 2013

Someone thought the lines in our Google Earth model product were too thin. I have updated the thickness from 1.0 to 1.5 on all future models generated. Does anyone not like the change? Would you prefer the spaghetti lines to have a thickness different from that of the full model plots? Let me know.

June 3rd, 2013

HDOB data for SENEX, an air quality study in the southeast U.S. that began today and will run for approximately the next six weeks, will now appear in the non-tasked section of the live system. The decoder needed a small update to allow the data to be decoded.

As for the new recon system, work continues offline. I have almost completed the new storm slice feature for individual messages. At some point I will work on something that attempts to create storm slices in real time from each hurricane hunter pass through a storm, though I don't know how well that might work. My next step will likely be to start work on Google Maps mapping in the new HDOB decoder. I rewrote the entire HDOB decoder, but I have yet to add mapping back. Again, all of this is offline and not yet available. Then I will probably finish the rewrite of the dropsonde decoder, and then work some on the AXBT decoder.

At some point I may make some of these new decoders available, as an alpha product, on another part of the site for people to use for testing purposes. That would allow me to get feedback on the storm slice feature (URNT15 only, not yet for historic HDOB products), the Google Maps mapping for individual HDOB messages (which looks like the wind barbs you see in Google Earth), the updated dropsonde decoder (which decodes NASA Global Hawk data) and a better looking version of the AXBT decoder. Those would be the only parts of the new system available. Everything else would be incomplete, and you would get various errors if you tried to decode other products in the new system. (I still have to rewrite all the other decoders.) The new live system would remain unavailable until the end of this year, as everything else must be done before that can work and there is still a lot of work to do. Only the manual decoder would work. Perhaps I might make one old storm available in the new archive to get feedback on its new layout, but new data would not appear in that archive until the new system is complete and the new live system can be turned on. The current live system, archive and manual decoder would be unaffected by the release of any alpha product related to the new system.

June 2nd, 2013

It's that time of year again, when I go through the site verifying links. I have finally completed that task. I also verified all the overlays in our Google Earth overlay product which needed a lot of updating.

Now that all that is finally done I can get back to the new recon system. Speaking of the recon system, our current recon system now appears on another site. It is the only other site that hosts our recon system. When the new one is done I expect other sites will also like to make it available.

May 28th, 2013

Work continues on the new recon system. You can read a little more about it here, including a new addition that will appear late this year, storm slice for HDOB messages.

May 5th, 2013

There is a lot of work to do on the new recon system. The deeper I get into it, and oddly enough the more I get done, the more I realize I have to do. (Sometimes that means coming up with ideas for new products.) The first draft of the new system will have more features than I thought, and will therefore take longer to complete. I now believe the first draft will not be ready until late 2013, with the system being completed in early to mid 2014. I am pretty much rewriting most of it, and it makes sense to rewrite it now the way I want it to be in the end rather than waiting to do some of the new features.

I am fairly confident that the live Google Maps system will not be ready until 2014. While I have already developed some of it, I have yet to determine how it will access data in real time in an efficient manner. I need to think about that a lot, a whole lot, and while there are probably good methods to do it, the longer I wait, the more information on how best to do it will be available. However, some of what I already developed could be used in the new Google Maps mapping for individual observations. That simply displays the data, with no live component, and I worked that part out way back in late 2011. Yes, 2011. I finally rediscovered that work today; I had last touched it on January 6, 2012. Each individual decoded ob needs its own mapping component in the new system. The old mapping was awful and I am not using it. So I am left with either doing the new system without web based mapping for each ob until the next draft of the system, which I hope will have live Google Maps mapping, or going ahead and adding the feature I figured out in late 2011 now, which will give me a good idea of how the live Google Maps mapping will work, aside from the live component. Of course, that adds a lot more I have to figure out before the first draft is done. I am going to rewrite the URNT15 decoder and work out the new mapping. I also have a new feature idea for the HDOB output, and I think I will go ahead and do that now too: since I am rewriting the decoder anyway, I might as well write it the way it will need to be for that idea to work, if it works out. So again, lots to do.

Another thing I want in the first draft of the recon system is a way to gather data from the NHC recon archive every so often, as a backup to checking the same individual product files constantly, which leads to missed obs. That might be more difficult than I think. I know the obvious way: get the entire directory listing and see what is new since the last directory listing I downloaded. However, how do I then know which files I already have and which I do not? It would mean having to get all the files, since the file names do not tell me what they are. Then I would need to either process all of that data, or determine which data is already in the system and which needs to be added. Hopefully there is some other way. I haven't gotten into that much yet, but it is again something I want in the first draft of the system. Additionally, I would like NOAA AXBT data in Google Earth in the first draft as well. That requires additional steps to associate the data with a specific mission, since the precise mission information is not in the actual product. Basically, the second draft of the recon system would have live Google Maps mapping and perhaps GlobalHawk track data, both assuming I can work them out.

The GlobalHawk track data is a little tricky because it is not easily available and involves downloading large files, which would mean downloading infrequently, perhaps every half hour. That data could also disappear from the web at any time or change formats, so that system is the lowest priority. The dropsonde decoder in the new system will be able to process sonde data in real time from the GlobalHawk UAV as it comes across under UZNT13, so that is the most important data anyway. The actual track data and observations at every moment of the flight are less important way up that high compared to the sonde data.

Finally, I also wanted to say that the new recon system, like the model system, will be made available for free to weather professionals, media organizations, educational institutions and very popular websites. I do not know when the new recon system might be available for download, whether by the time of the first draft or only with the final version.

May 3rd, 2013

Yesterday, best track and model data were updated on our site for older storms in the Atlantic, East Pacific and Central Pacific. All data prior to 1990 was recreated, along with all storm data, other than invests, for 2012. Select storms from the remaining years were also updated based upon best track data that changed since we last recreated the data. You can read about some of the actual changes here. We updated all the older information rather than go through the old years and find what specifically needed to be updated.

In addition to working on the new recon system this month, some links to the sites we link to under Analysis & Forecasts will be updated. We did that some on the "Model Data" page, though the work is incomplete there and has yet to be done on other pages.

April 29th, 2013

A lot of work is being done offline on the web based interface of the recon archive. Some major changes are being made to how the data is stored. The new directory structure will require rearranging tens of thousands of files. (Update: I created the system to do that, and shockingly it reorganized everything in the Atlantic and East & Central Pacific in under 4 minutes.) This will be done offline for now, and when the recon system is ready to be put online later this year, the current system that is online will be completely replaced. Every part of the new system will have data located in a different place than in the old system. The entire web based interface of the system will change.

This particular organizational change is being done so that data can be managed more efficiently in the future. Products will be organized into mission folders rather than having mission folders organized into product folders. The change could also allow various other files created regarding a mission to be added to its mission folder, where all the other product data for that mission would then be located. I'm not sure what will come of this move, but it will be more beneficial to do it the new way. It means a lot of work rewriting a lot of things and writing a system to move all the years of recon data to the new structure, but with the massive rewrite underway, now is the time to do it. When I first wrote the recon system, I organized data by product and then had mission data in each of those folders. Mission "01" would exist in the "URNT15" folder and also in the "UZNT13" folder, for example. For navigational purposes, based on how the web based interface works, that made some sense at the time. Even then, though, the Google Earth maps were always put in the URNT15 folder (or whatever the HDOB product type is for older messages) because that is how things started out on our site. Eventually all the other products were in the Google Earth file and it was too late to rewrite the system to organize data differently without major changes. So now is the time to make the change. A Google Earth file can now be put in the main mission folder rather than in an HDOB folder.
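To illustrate the difference, with placeholder file names and the mission "01" and product types from the example above:

    Old structure (organized by product, missions inside):
      URNT15/01/<HDOB files>
      URNT15/01/mission.kmz   (Google Earth file kept with the HDOB product)
      UZNT13/01/<dropsonde files>

    New structure (organized by mission, products inside):
      01/URNT15/<HDOB files>
      01/UZNT13/<dropsonde files>
      01/mission.kmz          (Google Earth file in the main mission folder)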

I have no idea when the new system will be moved online. I plan to move it online before a lot of new features are added, but with all the work to do before I can even think about new features, it will likely still be the middle of the hurricane season before it is ready. The system will be very different internally than it used to be, so it will need a lot of testing. The Google Earth files will still look the same, and there really will not be any new features at first; it is just that the system will be rewritten with the ability to add to it in the future. Recon data in Google Maps will not be available at the start. That will be the main new feature that comes later. (I actually worked out how the wind barbs would work in Google Maps, but I have yet to decide how all the data would update efficiently in real time in Google Maps. There is a lot of thinking to do about that.)

The only new feature at the start would probably be dropsonde data from NASA, which was being rejected by the old system. (I'm talking about just dropsonde data from the Global Hawk unmanned aerial vehicle that comes in under UZNT13, not track data. Track data is something that may come later in the year or next year.) Some other dropsonde data was also not getting processed that the new system will process. Non-tropical RECCO data will also be available in the new system, probably from the start, though it will take a long time to add all the historic data. I'm not sure when NOAA AXBT data will be in the Google Earth file. That may take some time and may not be available at the start. That system will run differently than the main part of the recon system, as it will not be organized by storm. I will attempt to write something that tries to associate that data with a storm if possible, but I have yet to do that. That is where a file in the mission folder would help. I could come up with a system that would attempt to associate AXBT data elsewhere on the site with a mission for a storm. If that worked, I could add that to a file, and then the web based interface of the system would know to have that data available for that storm on its storm page.

Upon thinking about it more, there will be some minor things here and there probably at the start. The last location reported by the aircraft will be on the front page in a Google Map. The front page will also be designed better when it comes to showing what recon is current. There will also be dedicated pages for aircraft where you can see the observations from that aircraft only. That is helpful if there are two flights going from the same agency at the same time. You can pick which flight you want to track and focus on those obs rather than keep checking to see which obs are from what mission. A lot of that I have yet to work out, but that is a core part of the system so I expect it to be released when the new system is ready. Additional improvements would likely be made to that in the future. One very important thing I have to do is add information to a file that will have all active recon in it. I have yet to do that. That information will be used on the site in various places. That is something I need to think about more. Current recon information on the top of every page perhaps?

I have done a lot of work on the web based interface in the past few weeks. Again, all offline. It is really taking shape. In fact, at this point it is close to how it will look, but there is still a lot of work to do to make everything work, especially since I am now changing the directory structure of the data. It looks very much like the model system's layout. I am trying to make the user interface friendlier when it comes to the page where you access data. The page will no longer use actual frames, though it will appear as though it does. I am adding a CSS menu, like the main menu at the top of our site, that will give you other options, like viewing other data for the mission and accessing other main features of the recon system. Putting it in a menu gives it a little more room and makes things less cluttered.

Lots of work has been done and there is a lot of work yet to do! If anyone has any suggestions, let me know! It could be something simple. Questions like: wouldn't it be cool if this did that, or wouldn't it be nicer if this thing here went over there instead? That is the kind of thing I am doing in the web based interface, trying to imagine how people will use it and make everything easier to navigate. Suggestions don't have to be limited to something simple. One thing I probably plan on doing is having a file created after each vortex message comes through that will contain every vortex message for the storm over its lifetime. Then you can have them all in one file to open. I like to do that manually by opening all the previous missions or saving them in Google Earth as they happen so I can see the previous storm track based on vortex messages. An easy file to open would be much better though. That is something that might just be released when I actually do release the system online. We'll see. Again, a lot to do!

April 26th, 2013

Work continues offline on the web based interface of the new recon system.

April 12th, 2013

Work continues on the new recon system. I am currently working on the web based interface of the archive and how that will work in the new system. All the work continues to be done offline. While I hope to have the framework of the new recon system done early in the hurricane season, any new features will likely not be available until late this year.

Have a suggestion for our site's recon system? Some important decisions on how it will work in the future are continuously being made. If you have any suggestions please let us know.

February 13th, 2013

I was originally going to post an update on what part of the new recon system I am working on, but I keep working on so many parts that I'm simply posting an update saying I am working on the recon system. The new recon system will not be released until the middle of the year at the earliest. I know some dropsonde observations currently cannot be decoded at times even though they are valid. The dropsonde decoder I rewrote at the end of last year does work for them, but it will not be online until the rest of the system is. The code of the system has changed too much, and that decoder needs to be tested more before release. At the moment, I am working on the framework of how the new system will work and simultaneously on all aspects of the live system. There is a massive amount of work to do over the next many months as I basically rewrite all the code and slowly add new features. By request, I will recreate some sections of historic recon data if needed; however, I would rather recreate all the data when I have more of the new system done. I am at times working on that part as well, which recreates all the data on demand. While rewriting the system, though, everything is broken in my offline version. I am slowly trying to rewrite everything that would allow me to both add and recreate data, but that part uses most of the code of the live system, so I am working on a lot of things.

Someone asked me about winter missions. As a reminder, our site carries winter missions as well. I have increased the updating interval for the live systems in the Atlantic and the East & Central Pacific. The new recon system should handle these types of missions better when organizing data. That is one of many things I am working on.

I am working on the recon system full time, so even though I don't post any updates, I am working on it all the time.

January 23rd, 2013

I have added back the East and Central Pacific recon system here for winter missions. I have resumed work on the recon system after a long break. I am working on restoring older Google Earth data and on updating the mapping system that will be used to process all that old data. Older decoded data will continue to remain unavailable after that point while the new web based interface of the recon system is developed.