

An indisputable use case for supercomputers is computing next-day and next-week weather models. By definition, a next-day weather prediction is utterly useless if it takes longer than a day to compute, and it becomes progressively more useful with every hour shaved off: more time to warn motorists to stay off the road, more time to plan evacuation routes, more time for farmers to adjust crop management, more time for everything. NOAA in the USA draws in sensor data from all of North America, and since weather is locally felt but globally influenced, even that isn't enough for a perfect weather model. Even today, there is more data the models could consume, but feeding it in would make the predictions take too long. The only solution is to raise the bar yet again and expand the supercomputers themselves.
Supercomputers are not super because they’re bigger. They are super because they can do gargantuan tasks within the required deadlines.

At this particular moment, the people of Minnesota are self-organizing the resistance against the invasion of their state, with no unified leadership structure in place. So I wouldn't say unified leadership is always mandatory.
Long live l’étoile du nord.