I’m going out on a limb here to talk about an idea I’ve been thinking about. I’m not the first to come up with it, but other people in my field have started talking about it as well.
How can the Weather Enterprise take advantage of AR and VR?
I love to follow tech blogs, and on my daily commute (one hour each way, yes, you read that right) I listen to a variety of podcasts. (I use Overcast. Not because it has a “weather-y” name, but because it’s my favorite podcast grabber and player.) In 2017, the talk is all “AR” and “VR”. AR really took hold in 2016 with “Pokémon Go”, which my kids played for about a month before going back to Minecraft and Roblox. (BTW, Roblox, if you create a VR world, you’ll have my daughter hooked.)
We love maps and displays in the weather enterprise. Where would we be without a CONUS map with a nice satellite loop, radar mosaic, observations, and NOAA SPC convective outlooks?
But…what if we could hold up our phones and see 3- or 4-dimensional wind patterns? I’m not a golfer, but what if a golfer about to tee up a drive wanted to see the real-time wind patterns to help align his or her shot? Or a fishing boat captain seeing how far he is from the SST gradient or canyon? Storm chasers aligning their viewfinders with the best convective initiation? My windshield popping up a “hazard ahead” alert showing the bearing and distance to the snow squall I’m about to drive into? (Would that have prevented the crashes, and saved lives, on I-83 south of York, PA last winter?)
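The geometry behind that “hazard ahead” alert is straightforward great-circle math. Here’s a minimal sketch of how an app might compute the distance and bearing from a driver to a radar-detected squall; the coordinates are made up for illustration, and a real system would of course pull positions from GPS and a live radar feed:

```python
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def initial_bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing (0–360°, 0 = north) from point 1 toward point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlmb = math.radians(lon2 - lon1)
    y = math.sin(dlmb) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlmb))
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

if __name__ == "__main__":
    # Hypothetical positions: a car on I-83 south of York, PA,
    # and a snow squall a few miles up the road.
    car = (39.88, -76.73)
    squall = (39.96, -76.72)
    km = haversine_km(*car, *squall)
    brg = initial_bearing_deg(*car, *squall)
    print(f"Hazard ahead: snow squall {km:.1f} km away, bearing {brg:.0f}°")
```

An AR windshield display would then just compare that bearing with the car’s heading to decide whether the squall is actually “ahead.”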
Of course, there are challenges: processing “real time” data, laying it out in a 3D or 4D display, and figuring out what people would actually use. (Someone asked, “Why would we need to see it on a screen if we can see the rain approaching with our own eyes?”) We are challenged enough getting the right information to people (phone alerts, apps, sirens, etc.). Would this be just another complicated layer of information?